Search results for: bus generalized cost
6628 Construction Unit Rate Factor Modelling Using Neural Networks
Authors: Balimu Mwiya, Mundia Muya, Chabota Kaliba, Peter Mukalula
Abstract:
Factors affecting construction unit cost vary depending on a country’s political, economic, social and technological inclinations. Factors affecting construction costs have been studied from various perspectives. Analysis of cost factors requires an appreciation of a country’s practices. Identified cost factors provide an indication of a country’s construction economic strata. The purpose of this paper is to identify the essential factors that affect unit cost estimation and their breakdown using artificial neural networks. Twenty-five (25) identified cost factors in road construction were subjected to a questionnaire survey, and through SPSS factor analysis the factors were reduced to eight. The 8 factors were analysed using a neural network (NN) to determine the proportionate breakdown of the cost factors in a given construction unit rate. The NN predicted that the political environment accounted for 44% of the unit rate, followed by contractor capacity at 22%, and financial delays, project feasibility, and overhead and profit each at 11%. Project location, material availability and the corruption perception index had minimal impact on the unit cost given the training data provided. Quantified cost factors can be incorporated in unit cost estimation models (UCEM) to produce more accurate estimates. This can improve the cost estimation of infrastructure projects and establish a benchmark standard to assist in aligning work practices and training new staff, permitting the ongoing development of best practices in cost estimation to become more effective.
Keywords: construction cost factors, neural networks, roadworks, Zambian construction industry
Procedia PDF Downloads 362
6627 Effect of Cloud Computing on Enterprises
Authors: Amir Rashid
Abstract:
Today is the world of innovation, where everyone is looking for a change. Organizations are now looking toward virtualization in order to minimize their computing cost. Cloud computing has introduced itself as a means of reducing computing cost. It offers a different approach to making computing better by improving utilization and reducing infrastructure and administrative costs. Cloud computing is basically the amalgamation of utility computing and SaaS (Software as a Service). Cloud computing is still quite new to organizations, as it is at its deployment stage. For this reason, organizations are not confident whether to adopt it or not. This thesis investigates the security and cost concerns organizations face. The benefits and drawbacks organizations may gain or suffer by adopting cloud computing are highlighted. In conclusion, cloud computing is a better option for small and medium organizations than for large companies, both in terms of data security and cost.
Keywords: cloud computing, security, cost, elasticity, PaaS, IaaS, SaaS
Procedia PDF Downloads 340
6626 When Sex Matters: A Comparative Generalized Structural Equation Model (GSEM) for the Determinants of Stunting Amongst Under-fives in Uganda
Authors: Vallence Ngabo M., Leonard Atuhaire, Peter Clever Rutayisire
Abstract:
The main aim of this study was to establish the differences in both the determinants of stunting and the causal mechanisms through which the identified determinants influence stunting amongst male and female under-fives in Uganda. Literature shows that male children below the age of five years are at a higher risk of being stunted than their female counterparts. Specifically, studies in Uganda indicate that being a male child is positively associated with stunting, while being a female is negatively associated with stunting. Data for 904 male and 829 female under-fives were extracted from the UDHS-2016 survey dataset. Key variables for this study were identified and used in generating the relevant models and paths. Structural equation modeling techniques were used in their generalized form (GSEM). The generalized nature necessitated specifying both the family and link functions for each response variable in the system of the model. The sex of the child (b4) was used as a grouping factor, and the height-for-age (HAZ) scores were used to construct the stunting status of under-fives. The estimated models and paths clearly indicated that the set of underlying factors influencing male and female under-fives was different, as was the path through which those factors influence stunting. However, some of the determinants that influenced stunting amongst male under-fives also influenced stunting amongst female under-fives. To reduce stunting to the desired level, it is important to consider the multifaceted and complex nature of the risk factors that influence stunting amongst under-fives and, more importantly, to consider the different sex-specific factors and the causal mechanisms or paths through which they influence stunting.
Keywords: stunting, underfives, sex of the child, GSEM, causal mechanism
Procedia PDF Downloads 140
6625 An Approach to Make Low-Cost Self-Compacting Geo-Polymer Concrete
Authors: Ankit Chakraborty, Raj Shah, Prayas Variya
Abstract:
Self-compacting geo-polymer concrete is a blended version of the self-compacting concrete developed in Japan by H. Okamura in 1986 and the geo-polymer concrete proposed by Davidovits in 1999. The method is eco-friendly, as it entails low CO₂ emissions, and it reduces labor cost through its self-compacting property and zero cement content. We propose an approach to reduce concreting cost and make concreting eco-friendly by replacing cement fully, and sand partially, with a certain amount of industrial waste. This will reduce overall concreting cost through self-compactability and material replacement, form an eco-friendly concreting technique, and give better fresh-property and hardened-property results compared to self-compacting concrete and geo-polymer concrete.
Keywords: geopolymer concrete, low cost concreting, low carbon emission, self compactability
Procedia PDF Downloads 232
6624 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups
Authors: Naushad Mamode Khan
Abstract:
The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), handling equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is rather computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments, and it is shown to yield estimates as efficient as those of GQL-I while being far more computationally stable.
Keywords: longitudinal, com-Poisson, ill-conditioned, INAR(1), GLMs, GQL
Procedia PDF Downloads 354
6623 Generalized Model Estimating Strength of Bauxite Residue-Lime Mix
Authors: Sujeet Kumar, Arun Prasad
Abstract:
The present work investigates the effect of multiple parameters on the unconfined compressive strength of the bauxite residue-lime mix. A number of unconfined compressive strength tests considering various curing times, lime contents, dry densities and moisture contents were carried out. The results show that an empirical correlation may be successfully developed using volumetric lime content, porosity, moisture content, curing time, and unconfined compressive strength for the range of bauxite residue-lime mixes studied. The proposed empirical correlations efficiently predict the strength of the bauxite residue-lime mix, and they can be used as a generalized empirical equation to estimate unconfined compressive strength.
Keywords: bauxite residue, curing time, porosity/volumetric lime ratio, unconfined compressive strength
Procedia PDF Downloads 236
6622 Cost Effectiveness of Transcatheter Aortic Valve Replacement vs Surgical Aortic Valve Replacement in a Low-Middle Income Country
Authors: Vasuki Rayapati, Bhanu Duggal
Abstract:
Transcatheter aortic valve replacement (TAVR) is the recommended treatment over surgical aortic valve replacement (SAVR) for high-risk groups, i.e., patients >75 years of age with severe symptomatic aortic stenosis (AS). In high income countries, TAVR is more cost effective because of i) a reduction in total length of stay, including fewer days in the ICU, and ii) higher non-procedural costs for SAVR, such as the cost of general anaesthesia. In India, there are two kinds of hospitals, public and private, and more patients visit public sector hospitals than private ones. In an LMIC like India, especially in the public health sector, the cost of TAVR is prohibitive. A small study from three public hospitals in India indicated that the cost of TAVR would need to decrease by at least two-thirds to become a cost-effective option for severe AS in the public health sector.
Keywords: cost effectiveness, TAVR vs SAVR, LMIC, HTA
Procedia PDF Downloads 107
6621 Healthcare Utilization and Costs of Specific Obesity Related Health Conditions in Alberta, Canada
Authors: Sonia Butalia, Huong Luu, Alexis Guigue, Karen J. B. Martins, Khanh Vu, Scott W. Klarenbach
Abstract:
Obesity-related health conditions impose a substantial economic burden on payers due to increased healthcare use. Estimates of healthcare resource use and costs associated with obesity-related comorbidities are needed to inform policies and interventions targeting these conditions. Methods: Adults living with obesity were identified (a procedure-related body mass index code for class 2/3 obesity between 2012 and 2019 in Alberta, Canada; excluding those with bariatric surgery), and outcomes were compared over 1-year (2019/2020) between those who had and did not have specific obesity-related comorbidities. The probability of using a healthcare service (based on the odds ratio of a zero [OR-zero] cost) was compared; 95% confidence intervals (CI) were reported. Logistic regression and a generalized linear model with log link and gamma distribution were used for total healthcare cost comparisons ($CDN); cost ratios and estimated cost differences (95% CI) were reported. Potential socio-demographic and clinical confounders were adjusted for, and incremental cost differences were representative of a referent case. Results: A total of 220,190 adults living with obesity were included; 44% had hypertension, 25% had osteoarthritis, 24% had type-2 diabetes, 17% had cardiovascular disease, 12% had insulin resistance, 9% had chronic back pain, and 4% of females had polycystic ovarian syndrome (PCOS). 
The probability of hospitalization, ED visit, and ambulatory care was higher in those with a given obesity-related comorbidity versus those without: chronic back pain (hospitalization: 1.8-times [OR-zero: 0.57 [0.55/0.59]] / ED visit: 1.9-times [OR-zero: 0.54 [0.53/0.56]] / ambulatory care visit: 2.4-times [OR-zero: 0.41 [0.40/0.43]]), cardiovascular disease (2.7-times [OR-zero: 0.37 [0.36/0.38]] / 1.9-times [OR-zero: 0.52 [0.51/0.53]] / 2.8-times [OR-zero: 0.36 [0.35/0.36]]), osteoarthritis (2.0-times [OR-zero: 0.51 [0.50/0.53]] / 1.4-times [OR-zero: 0.74 [0.73/0.76]] / 2.5-times [OR-zero: 0.40 [0.40/0.41]]), type-2 diabetes (1.9-times [OR-zero: 0.54 [0.52/0.55]] / 1.4-times [OR-zero: 0.72 [0.70/0.73]] / 2.1-times [OR-zero: 0.47 [0.46/0.47]]), hypertension (1.8-times [OR-zero: 0.56 [0.54/0.57]] / 1.3-times [OR-zero: 0.79 [0.77/0.80]] / 2.2-times [OR-zero: 0.46 [0.45/0.47]]), PCOS (not significant / 1.2-times [OR-zero: 0.83 [0.79/0.88]] / not significant), and insulin resistance (1.1-times [OR-zero: 0.88 [0.84/0.91]] / 1.1-times [OR-zero: 0.92 [0.89/0.94]] / 1.8-times [OR-zero: 0.56 [0.54/0.57]]). After fully adjusting for potential confounders, the total healthcare cost ratio was higher in those with a given obesity-related comorbidity versus those without: chronic back pain (1.54-times [1.51/1.56]), cardiovascular disease (1.45-times [1.43/1.47]), osteoarthritis (1.36-times [1.35/1.38]), type-2 diabetes (1.30-times [1.28/1.31]), hypertension (1.27-times [1.26/1.28]), PCOS (1.08-times [1.05/1.11]), and insulin resistance (1.03-times [1.01/1.04]). Conclusions: Adults with obesity who have specific disease-related health conditions have a higher probability of healthcare use and incur greater costs than those without specific comorbidities; incremental costs are larger when other obesity-related health conditions are not adjusted for.
In a specific referent case, hypertension was costliest (44% had this condition with an additional annual cost of $715 [$678/$753]). If these findings hold for the Canadian population, hypertension in persons with obesity represents an estimated additional annual healthcare cost of $2.5 billion among adults living with obesity (based on an adult obesity rate of 26%). Results of this study can inform decision making on investment in interventions that are effective in treating obesity and its complications.
Keywords: administrative data, healthcare cost, obesity-related comorbidities, real world evidence
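As a back-of-envelope check, the abstract's ~$2.5 billion national figure can be reproduced from the rates it reports. The adult population count below is an outside assumption, not a figure from the study:

```python
# Rough reconstruction of the national hypertension cost estimate.
canadian_adults = 30_000_000      # assumed adult population, NOT from the abstract
obesity_rate = 0.26               # adult obesity rate reported in the abstract
hypertension_share = 0.44         # share of the obesity cohort with hypertension
extra_cost_per_person = 715       # additional annual cost ($CDN) in the referent case

# People with obesity and hypertension, times the incremental annual cost each.
total = canadian_adults * obesity_rate * hypertension_share * extra_cost_per_person
# total lands near $2.45 billion, consistent with the abstract's ~$2.5 billion
```

The product is sensitive mainly to the assumed adult population; with ~30 million adults it is consistent with the abstract's rounded figure.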
Procedia PDF Downloads 148
6620 Applying Serious Game Design Frameworks to Existing Games for Integration of Custom Learning Objectives
Authors: Jonathan D. Moore, Mark G. Reith, David S. Long
Abstract:
Serious games (SGs) have been shown to be an effective teaching tool in many contexts. Because of the success of SGs, several design frameworks have been created to expedite the process of making original serious games to teach specific learning objectives (LOs). Even with these frameworks, the time required to create a custom SG from conception to implementation can range from months to years. Furthermore, it is even more difficult to design a game framework that allows an instructor to create customized game variants supporting multiple LOs within the same field. This paper proposes a refactoring methodology to apply the theoretical principles from well-established design frameworks to a pre-existing serious game. The expected result is a generalized game that can be quickly customized to teach LOs not originally targeted by the game. This methodology begins by describing the general components in a game, then uses a combination of two SG design frameworks to extract the teaching elements present in the game. The identified teaching elements are then used as the theoretical basis to determine the range of LOs that can be taught by the game. This paper evaluates the proposed methodology by presenting a case study of refactoring the serious game Battlespace Next (BSN) to teach joint military capabilities. The range of LOs that can be taught by the generalized BSN is identified, and examples of creating custom LOs are given. Survey results from users of the generalized game are also provided. Lastly, the expected impact of this work is discussed and a road map for future work and evaluation is presented.
Keywords: serious games, learning objectives, game design, learning theory, game framework
Procedia PDF Downloads 115
6619 Automated Resin Transfer Moulding of Carbon Phenolic Composites
Authors: Zhenyu Du, Ed Collings, James Meredith
Abstract:
The high cost of composite materials versus conventional materials remains a major barrier to uptake in the transport sector. This is exacerbated by a shortage of skilled labour, which makes the labour content of a hand-laid composite component (~40 % of total cost) an obvious target for reduction. Automation is a method to remove labour cost and improve quality. This work focuses on the challenges and benefits of automating the manufacturing process from raw fibre to trimmed component. It details the experimental work required to complete an automation cell, the control strategy used to integrate all machines, and the final benefits in terms of throughput and cost.
Keywords: automation, low cost technologies, processing and manufacturing technologies, resin transfer moulding
Procedia PDF Downloads 292
6618 Application of Costing System in the Small and Medium Sized Enterprises (SME) in Turkey
Authors: Hamide Özyürek, Metin Yılmaz
Abstract:
In the literature, traditional cost systems are considered more accurate for standard processes, similar and limited production lines, and production with a high share of direct costs. However, as production facilities become increasingly sophisticated, the share of direct costs decreases while overhead expenses grow, a situation that has led researchers to look for alternative techniques to traditional cost systems. A variety of cost management approaches have been introduced, for example total quality management (TQM), just-in-time (JIT), benchmarking, kaizen costing, target costing, life cycle costing (LCC), activity-based costing (ABC) and value engineering. Management and cost applications have changed over the past decade and will continue to change. Modern cost systems can provide relevant and accurate cost information. These methods support decisions about customer, product and process improvement. The aim of this study is to describe and explain the adoption and application of costing systems in SMEs. It reports on a survey of small and medium-sized enterprises (SMEs) in Ankara conducted during 2014. The survey results were evaluated using the SPSS package program.
Keywords: modern costing systems, managerial accounting, cost accounting, costing
Procedia PDF Downloads 566
6617 Aircraft Line Maintenance Equipped with Decision Support System
Authors: B. Sudarsan Baskar, S. Pooja Pragati, S. Raj Kumar
Abstract:
Cost effectiveness in aircraft maintenance has become a high priority in recent years. It can be achieved when line maintenance activities are incorporated at airports during turnaround time (TAT). The present work examines the shortcomings that affect the dispatching of aircraft, aiming at high fleet operability and low maintenance cost. The operational and cost constraints are discussed, and an alternative mechanism is suggested. The possible allocation of all deferred maintenance tasks to a set of suitable airport resources is termed an alternative and is discussed in this paper using data collected from Kingfisher Airlines.
Keywords: decision support system, aircraft maintenance planning, maintenance-cost, RUL (remaining useful life), logistics, supply chain management
Procedia PDF Downloads 502
6616 MLProxy: SLA-Aware Reverse Proxy for Machine Learning Inference Serving on Serverless Computing Platforms
Authors: Nima Mahmoudi, Hamzeh Khazaei
Abstract:
Serving machine learning inference workloads on the cloud is still a challenging task at the production level. The optimal configuration of the inference workload to meet SLA requirements while optimizing the infrastructure costs is highly complicated due to the complex interaction between batch configuration, resource configurations, and the variable arrival process. Serverless computing has emerged in recent years to automate most infrastructure management tasks. Workload batching has revealed the potential to improve the response time and cost-effectiveness of machine learning serving workloads. However, it has not yet been supported out of the box by serverless computing platforms. Our experiments have shown that for various machine learning workloads, batching can hugely improve the system’s efficiency by reducing the processing overhead per request. In this work, we present MLProxy, an adaptive reverse proxy to support efficient machine learning serving workloads on serverless computing systems. MLProxy supports adaptive batching to ensure SLA compliance while optimizing serverless costs. We performed rigorous experiments on Knative to demonstrate the effectiveness of MLProxy. We showed that MLProxy could reduce the cost of serverless deployment by up to 92% while reducing SLA violations by up to 99%, results that can be generalized across state-of-the-art model serving frameworks.
Keywords: serverless computing, machine learning, inference serving, Knative, google cloud run, optimization
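The batching trade-off this abstract describes, fixed per-batch overhead amortized over more requests versus the extra queueing delay of the oldest request, can be illustrated with a toy cost model. The numbers and the greedy policy below are illustrative assumptions, not MLProxy's actual algorithm:

```python
# Toy latency model: a batch of size b takes fixed_ms + marginal_ms * b to run,
# so the per-request processing cost falls as the batch grows.
def per_request_cost(batch_size, fixed_ms=50.0, marginal_ms=5.0):
    return fixed_ms / batch_size + marginal_ms

def largest_sla_compliant_batch(sla_ms, arrival_gap_ms,
                                fixed_ms=50.0, marginal_ms=5.0, max_batch=64):
    """Pick the largest batch whose OLDEST request still meets the SLA:
    it waits (b - 1) * arrival_gap_ms for the batch to fill, then the batch runs."""
    best = 1
    for b in range(1, max_batch + 1):
        latency_oldest = (b - 1) * arrival_gap_ms + fixed_ms + marginal_ms * b
        if latency_oldest <= sla_ms:
            best = b
    return best

# With a 200 ms SLA and requests arriving every 10 ms, batching is still worthwhile:
chosen = largest_sla_compliant_batch(sla_ms=200, arrival_gap_ms=10)
```

Under these assumed numbers, batching cuts per-request processing cost several-fold while the policy caps the batch size to keep the first-queued request inside the SLA, which is the essence of SLA-aware adaptive batching.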
Procedia PDF Downloads 179
6615 Apply Commitment Method in Power System to Minimize the Fuel Cost
Authors: Mohamed Shaban, Adel Yahya
Abstract:
The goal of this study is to schedule power generation units to minimize fuel consumption cost, based on a model that solves unit commitment problems. This can be done by utilizing the forward dynamic programming method to determine the most economic scheduling of generating units. The model was applied to a power station consisting of four generating units. The obtained results show that the application of forward dynamic programming offers a substantial reduction in fuel consumption cost: the cost was reduced from $116,326 to $102,181 within a 24-hour period, a saving of about 12.16%. The study emphasizes the importance of applying scheduling models to the operation of power generation units; the consequence is less fuel consumption, less loss of power and less pollution.
Keywords: unit commitment, forward dynamic, fuel cost, programming, generation scheduling, operation cost, power system, generating units
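The forward dynamic programming idea can be sketched for a four-unit station: enumerate commitment states, cost each feasible state per hour via economic dispatch, and carry the cheapest cost-to-reach forward with start-up costs on transitions. The cost curves, unit limits and start-up cost below are invented for illustration and are not the paper's data:

```python
from itertools import product

# Hypothetical quadratic fuel-cost curves F(P) = a + b*P + c*P^2 (NOT the paper's data).
UNITS = [  # (a, b, c, Pmin, Pmax) per unit, outputs in MW
    (500, 10.0, 0.002, 100, 400),
    (300, 11.0, 0.004, 50, 300),
    (100, 12.0, 0.006, 50, 200),
    (80, 13.0, 0.008, 20, 100),
]
START_COST = 350.0  # flat start-up cost per unit (assumed)

def dispatch_cost(state, load):
    """Cheapest fuel cost of serving `load` MW with the units committed in
    `state` (tuple of 0/1), by brute force over outputs in 10 MW steps."""
    on = [UNITS[i] for i in range(len(UNITS)) if state[i]]
    if not on or sum(u[4] for u in on) < load or sum(u[3] for u in on) > load:
        return None  # committed units cannot meet the load
    best = None
    for outputs in product(*[range(u[3], u[4] + 1, 10) for u in on]):
        if sum(outputs) != load:
            continue
        cost = sum(a + b * p + c * p * p for (a, b, c, _, _), p in zip(on, outputs))
        if best is None or cost < best:
            best = cost
    return best

def unit_commitment(loads):
    """Forward dynamic programming over commitment states, hour by hour.
    The initial state is free: no start-up cost is charged in hour 0."""
    states = list(product([0, 1], repeat=len(UNITS)))
    prev = {s: 0.0 for s in states}  # cost to reach each state before hour 0
    for load in loads:
        cur = {}
        for s in states:
            run = dispatch_cost(s, load)
            if run is None:
                continue
            best = None
            for p, acc in prev.items():
                starts = sum(1 for i in range(len(UNITS)) if s[i] and not p[i])
                total = acc + run + starts * START_COST
                if best is None or total < best:
                    best = total
            cur[s] = best
        prev = cur
    return min(prev.values())

schedule_cost = unit_commitment([400, 600, 800, 600])  # cheapest 4-hour total
```

The real method tracks a limited set of feasible states per stage and uses a proper economic dispatch solver; the brute-force dispatch here only keeps the sketch self-contained.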
Procedia PDF Downloads 611
6614 Bivariate Generalization of q-α-Bernstein Polynomials
Authors: Tarul Garg, P. N. Agrawal
Abstract:
We propose to define the q-analogue of the α-Bernstein Kantorovich operators and then introduce the q-bivariate generalization of these operators to study the approximation of functions of two variables. We obtain the rate of convergence of these bivariate operators by means of the total modulus of continuity, the partial modulus of continuity and Peetre’s K-functional for continuous functions. Further, in order to study the approximation of functions of two variables in a space bigger than the space of continuous functions, i.e., the Bögel space, the GBS (Generalized Boolean Sum) of the q-bivariate operators is considered, and the degree of approximation is discussed for Bögel continuous and Bögel differentiable functions with the aid of the Lipschitz class and the mixed modulus of smoothness.
Keywords: Bögel continuous, Bögel differentiable, generalized Boolean sum, K-functional, mixed modulus of smoothness
Procedia PDF Downloads 379
6613 Empirical Study of Correlation between the Cost Performance Index Stability and the Project Cost Forecast Accuracy in Construction Projects
Authors: Amin AminiKhafri, James M. Dawson-Edwards, Ryan M. Simpson, Simaan M. AbouRizk
Abstract:
Earned value management (EVM) has been introduced as an integrated method to combine schedule, budget, and work breakdown structure (WBS). EVM provides various indices to demonstrate project performance, including the cost performance index (CPI). CPI is also used to forecast the final project cost at completion based on cost performance during project execution. Knowing the final project cost during execution allows corrective actions to be initiated, which can enhance project outcomes. CPI, however, is not constant during the project, and calculating the final project cost using a variable index is an inaccurate and challenging task for practitioners. Since CPI is based on cumulative progress values, and because of the learning curve effect, CPI variation dampens and stabilizes as the project progresses. Although various definitions of CPI stability have been proposed in the literature, many scholars have agreed upon the definition that considers a project stable if the CPI at 20% completion varies by less than 0.1 from the final CPI. While the 20% completion point is recognized as the stability point for military development projects, the stability of construction projects has not been studied. In the current study, an empirical study was first conducted using construction project data to determine the stability point for construction projects. Early findings demonstrated that the majority of construction projects stabilize towards completion (i.e., after the 70% completion point). To investigate the effect of CPI stability on cost forecast accuracy, the correlation between CPI stability and the accuracy of the cost-at-completion forecast was also investigated. It was determined that as projects progress closer towards completion, variation of the CPI decreases and final project cost forecast accuracy increases. Most projects were found to have 90% accuracy in the final cost forecast at the 70% completion point, which is in line with the CPI stability findings.
It can be concluded that early stabilization of the project CPI results in more accurate cost at completion forecasts.
Keywords: cost performance index, earned value management, empirical study, final project cost
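The indices discussed in this abstract have simple closed forms. A minimal sketch of CPI, the standard cost-at-completion forecast, and the 0.1 stability criterion, with illustrative numbers:

```python
def cpi(earned_value, actual_cost):
    """Cost performance index: budgeted value of work performed per dollar spent."""
    return earned_value / actual_cost

def estimate_at_completion(budget_at_completion, cpi_value):
    """Common EVM forecast of final cost, assuming the current cost
    efficiency persists for the remaining work: EAC = BAC / CPI."""
    return budget_at_completion / cpi_value

def is_stable(cpi_at_point, cpi_final, tolerance=0.1):
    """Stability definition cited in the abstract: CPI within 0.1 of the final CPI."""
    return abs(cpi_at_point - cpi_final) < tolerance

# Illustrative project: $1M budget, 70% of value earned, $750k spent so far.
index = cpi(700_000, 750_000)                        # about 0.933, i.e. over budget
forecast = estimate_at_completion(1_000_000, index)  # about $1.07M at completion
```

A CPI below 1 inflates the forecast proportionally, which is why a CPI that has stabilized (as most construction projects were found to do after 70% completion) yields a trustworthy forecast.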
Procedia PDF Downloads 156
6612 Time and Cost Efficiency Analysis of Quick Die Change System on Metal Stamping Industry
Authors: Rudi Kurniawan Arief
Abstract:
Manufacturing cost and setup time are prime targets for improvement in the metal stamping industry, because material and component prices keep rising while customers require component prices to be cut year by year. Single Minute Exchange of Die (SMED) is one of many methods to reduce waste in the stamping industry. The Japanese Quick Die Change (QDC) die system is one of the SMED systems that can reduce both setup time and manufacturing cost. However, this system is rarely used in stamping industries. This paper analyzes how much the QDC die system can reduce setup time and manufacturing cost. The research was conducted by direct observation and by simulating and comparing the QDC die system with the conventional die system. In this research, we found that the QDC die system could save up to 35% of manufacturing cost and reduce setup time by 70%. The simulation showed that the QDC die system is effective for cost reduction but must be applied in several parallel production processes.
Keywords: press die, metal stamping, QDC system, single minute exchange die, manufacturing cost saving, SMED
Procedia PDF Downloads 170
6611 A Comparative Analysis of Residential Quality of Public and Private Estates in Lagos
Authors: S. Akinde, Jubril Olatunbosun
Abstract:
In recent years, most urban centers in Nigeria have been experiencing housing problems such as unaffordable housing and environmental challenges, all of which determine the nature of housing quality. The population continues to increase, and the demand for quality housing increases at probably the same rate. Several kinds of houses serve various purposes; the objective of low cost housing schemes, as the name suggests, is to make quality houses available to both the middle and lower classes of people in Lagos. A casual look into the study areas of the Iba Low Cost Housing Estate and the Unity Low Cost Housing Estate, in Ojo and Alimosho respectively in Lagos State, shows a huge demand for houses. The study area boasts a large population engaged in various commercial activities, with incomes at various levels. It would be fair to say that these people are mainly of the middle and lower classes, which means the low cost housing schemes truly serve their purpose. The Iba Low Cost Housing Scheme is publicly owned, while the Unity Estate (UE) Low Cost Housing Scheme is privately owned.
Keywords: housing, residential quality, low cost housing scheme, public, private estates
Procedia PDF Downloads 563
6610 Impact on Cost of Equity of Accounting and Disclosures
Authors: Abhishek Ranga
Abstract:
The study examined the effect of accounting choice and level of disclosure on the firm’s implied cost of equity in the Indian environment. For the study, accounting choice was classified as aggressive or conservative depending upon the firm’s choice of accounting methods, accounting policies and accounting estimates. Level of disclosure is the quantum of financial and non-financial information disclosed in the firm’s annual report, essentially in the notes to accounts section, the schedules forming part of the financial statements, and the Management Discussion and Analysis report. Regression models were developed with cost of equity as the dependent variable and accounting choice and level of disclosure as independent variables, along with selected control variables. Cost of equity was measured using the Edwards-Bell-Ohlson (EBO) valuation model, accounting choice was measured using the Modified Jones Model (MJM), and level of disclosure was measured using a disclosure index essentially drawn from the Botosan study. Results indicated a negative association between the implied cost of equity and conservative accounting choice, and also between level of disclosure and cost of equity.
Keywords: aggressive accounting choice, conservative accounting choice, disclosure, implied cost of equity
Procedia PDF Downloads 462
6609 Closed-Form Sharma-Mittal Entropy Rate for Gaussian Processes
Authors: Septimia Sarbu
Abstract:
The entropy rate of a stochastic process is a fundamental concept in information theory. It provides a limit to the amount of information that can be transmitted reliably over a communication channel, as stated by Shannon's coding theorems. Recently, researchers have focused on developing new measures of information that generalize Shannon's classical theory. The aim is to design more efficient information encoding and transmission schemes. This paper continues the study of generalized entropy rates, by deriving a closed-form solution to the Sharma-Mittal entropy rate for Gaussian processes. Using the squeeze theorem, we solve the limit in the definition of the entropy rate, for different values of alpha and beta, which are the parameters of the Sharma-Mittal entropy. In the end, we compare it with Shannon and Rényi's entropy rates for Gaussian processes.
Keywords: generalized entropies, Sharma-Mittal entropy rate, Gaussian processes, eigenvalues of the covariance matrix, squeeze theorem
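For reference, the two-parameter Sharma-Mittal entropy underlying this entropy rate can be written, for a probability density p, in its standard form (this states only the definition, not the paper's closed-form result):

```latex
H_{\alpha,\beta}(p) \;=\; \frac{1}{1-\beta}\left[\left(\int p(x)^{\alpha}\,\mathrm{d}x\right)^{\frac{1-\beta}{1-\alpha}}-1\right],
\qquad \alpha>0,\ \alpha\neq 1,\ \beta\neq 1,
```

with the entropy rate of a process \( \{X_t\} \) defined as \( \lim_{n\to\infty} H_{\alpha,\beta}(X_1,\dots,X_n)/n \). In the limit \( \beta\to 1 \) the Rényi entropy is recovered, for \( \beta=\alpha \) the Tsallis entropy, and for \( \alpha,\beta\to 1 \) the Shannon entropy, which is why a Sharma-Mittal entropy rate subsumes the Shannon and Rényi rates the abstract compares against.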
Procedia PDF Downloads 519
6608 The Association between Masculinity and Anxiety in Canadian Men
Authors: Nikk Leavitt, Peter Kellett, Cheryl Currie, Richard Larouche
Abstract:
Background: Masculinity has been associated with poor mental health outcomes in adult men and is colloquially referred to as toxic. Masculinity is traditionally measured using the Male Role Norms Inventory, which examines behaviors that may be common in men but that are themselves associated with poor mental health regardless of gender (e.g., aggressiveness). The purpose of this study was to examine whether masculinity is associated with generalized anxiety among men using this inventory versus a man’s personal definition of it. Method: An online survey collected data from 1,200 men aged 18-65 across Canada in July 2022. Masculinity was measured 1) using the Male Role Norms Inventory Short Form and 2) by asking men to self-define what being masculine means. Men were then asked to rate the extent to which they perceived themselves to be masculine on a scale of 1 to 10, based on their definition of the construct. Generalized anxiety disorder was measured using the GAD-7. Multiple linear regression was used to examine associations between each masculinity score and anxiety score, adjusting for confounders. Results: The masculinity score measured using the inventory was positively associated with increased anxiety scores among men (β = 0.02, p < 0.01). The masculinity subscales most strongly correlated with higher anxiety were restrictive emotionality (β = 0.29, p < 0.01) and dominance (β = 0.30, p < 0.01). When traditional masculinity was replaced by a man’s self-rated masculinity score in the model, the reverse association was found, with increasing masculinity resulting in a significantly reduced anxiety score (β = -0.13, p = 0.04). Discussion: These findings highlight the need to revisit the ways in which masculinity is defined and operationalized in research to better understand its impacts on men’s mental health.
The findings also highlight the importance of allowing participants to self-define gender-based constructs, given they are fluid and socially constructed.Keywords: masculinity, generalized anxiety disorder, race, intersectionality
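A minimal sketch of the multiple linear regression described in the Method section; the data, coefficients and variable names below are purely illustrative (not the study's data), with age standing in for the confounders:

```python
import numpy as np

# Hypothetical data: self-rated masculinity, one confounder (age), and a GAD-7-style
# anxiety score generated with a negative masculinity-anxiety association.
rng = np.random.default_rng(0)
n = 200
masculinity = rng.uniform(1, 10, n)
age = rng.uniform(18, 65, n)
anxiety = 5 - 0.13 * masculinity + 0.01 * age + rng.normal(0, 1, n)

# Design matrix with an intercept column; ordinary least squares via lstsq.
X = np.column_stack([np.ones(n), masculinity, age])
beta, *_ = np.linalg.lstsq(X, anxiety, rcond=None)
print(beta)  # beta[1] estimates the masculinity-anxiety association, adjusted for age
```

With the negative coefficient built into the simulated data, the fitted `beta[1]` comes out negative, mirroring the direction of the self-rated-masculinity result reported above.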
Procedia PDF Downloads 71
6607 Habitat Model Review and a Proposed Methodology to Value Economic Trade-Off between Cage Culture and Habitat of an Endemic Species in Lake Maninjau, Indonesia
Authors: Ivana Yuniarti, Iwan Ridwansyah
Abstract:
This paper reviews various methodologies for habitat assessment and proposes a methodology to assess the habitat of an endemic fish species in Lake Maninjau, Indonesia, as part of a Ph.D. project. The application is mainly aimed at assessing the trade-off between the economic value of aquaculture and that of the fishery. The proposed methodology is a generalized linear model (GLM) combined with GIS to assess presence-absence data, or a habitat suitability index (HSI) combined with the analytic hierarchy process (AHP). Further, a habitat replacement cost approach is planned to be used to calculate the habitat value as well as its trade-off with the economic value of aquaculture. The result of the study is expected to provide scientific input to local decision making and a reference for other areas in the country. Keywords: AHP, habitat, GLM, HSI, Maninjau
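A minimal sketch of the presence-absence GLM idea (a logistic link fitted by plain gradient ascent on the log-likelihood); the covariate name and coefficients are hypothetical, not Lake Maninjau data:

```python
import numpy as np

# Toy presence-absence GLM: probability of species presence as a function of one
# habitat covariate (here called "depth"). All values are illustrative.
rng = np.random.default_rng(1)
depth = rng.uniform(0, 10, 500)
p_true = 1 / (1 + np.exp(-(2.0 - 0.5 * depth)))  # presence more likely in shallow water
presence = rng.binomial(1, p_true)

X = np.column_stack([np.ones_like(depth), depth])
beta = np.zeros(2)
for _ in range(20_000):                  # plain gradient ascent on the mean log-likelihood
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.01 * X.T @ (presence - p) / len(depth)
print(beta)  # roughly recovers the generating coefficients [2.0, -0.5]
```

In practice, a library fitter (e.g., iteratively reweighted least squares in a statistics package) would replace the hand-rolled loop, and the fitted surface could be mapped over GIS layers to score habitat suitability.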
Procedia PDF Downloads 152
6606 The Study of Cost Accounting in S Company Based on TDABC
Authors: Heng Ma
Abstract:
Third-party warehousing logistics plays an important role in the development of external logistics. At present, third-party logistics in our country is still a young industry and its accounting system has not yet been established. The current financial accounting practice of third-party warehousing logistics mainly follows the traditional way of thinking: it can only provide the total cost of the entire enterprise for the accounting period and cannot reflect indirect operating cost information. To address the distortion of cost information in the third-party logistics industry and improve the level of logistics cost management, this paper combines theoretical research with case analysis, building a third-party logistics costing model based on Time-Driven Activity-Based Costing (TDABC) and taking S company as an example to account for and control its warehousing logistics cost. Based on the idea that “products consume activities and activities consume resources,” TDABC takes time as the main cost driver and uses time equations to assign resources to cost objects. In S company, the cost objects are three warehouses engaged in warehousing and transportation services (the second warehouse also serves as a transport point). Each warehouse comprises five departments, Business Unit, Production Unit, Settlement Center, Security Department and Equipment Division, and the activities in these departments are classified as in-out storage forecasting, in-out storage or transit, and safekeeping work. By computing the capacity cost rate and building the time equations, the paper calculates the final operating cost so as to reveal the real cost. The numerical results show that TDABC can accurately reflect the cost allocation across service customers and reveal the spare capacity cost of each resource center, verifying the feasibility and validity of TDABC for cost accounting in the third-party logistics industry.
It encourages enterprises to focus on customer relationship management and to reduce idle cost, strengthening the cost management of third-party logistics enterprises. Keywords: third-party logistics enterprises, TDABC, cost management, S company
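The two TDABC building blocks named above, the capacity cost rate and the time equation, can be sketched as follows; all figures are illustrative, not S company's:

```python
# TDABC sketch for one warehouse department. Step 1: capacity cost rate =
# total resource cost / practical time capacity. Step 2: a time equation gives
# the minutes each order consumes; multiplying by the rate assigns cost.
total_resource_cost = 120_000.0     # quarterly cost of the department (currency units)
practical_capacity_min = 80_000.0   # practical working time available (minutes)

capacity_cost_rate = total_resource_cost / practical_capacity_min  # cost per minute

def order_minutes(pallets, needs_inspection):
    """Hypothetical time equation: base handling + per-pallet time + inspection."""
    return 8.0 + 3.0 * pallets + (5.0 if needs_inspection else 0.0)

orders = [(2, False), (5, True), (1, False)]   # (pallets, inspection?) per order
used_min = sum(order_minutes(p, i) for p, i in orders)
assigned_cost = capacity_cost_rate * used_min
idle_capacity_cost = capacity_cost_rate * (practical_capacity_min - used_min)
print(round(assigned_cost, 2), round(idle_capacity_cost, 2))
```

The split between assigned cost and idle-capacity cost is exactly the "spare capacity cost of each resource center" that the abstract says TDABC makes visible.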
Procedia PDF Downloads 358
6605 A Problem on Homogeneous Isotropic Microstretch Thermoelastic Half Space with Mass Diffusion Medium under Different Theories
Authors: Devinder Singh, Rajneesh Kumar, Arvind Kumar
Abstract:
The present investigation deals with a generalized model of the equations for a homogeneous isotropic microstretch thermoelastic half space with a mass diffusion medium. The Lord-Shulman (LS), Green-Lindsay (GL) and coupled (CT) theories of generalized thermoelasticity are applied to investigate the problem. The stresses in the considered medium have been studied due to normal force and tangential force. The normal mode analysis technique is used to calculate the normal stress, shear stress, couple stresses and microstress. A numerical computation has been performed on the resulting quantities, and the computed numerical results are shown graphically. Keywords: microstretch, thermoelastic, normal mode analysis, normal and tangential force, microstress force
Procedia PDF Downloads 535
6604 Modelling Volatility of Cryptocurrencies: Evidence from GARCH Family of Models with Skewed Error Innovation Distributions
Authors: Timothy Kayode Samson, Adedoyin Isola Lawal
Abstract:
The past five years have seen a sharp increase in public interest in the crypto market, with its market capitalization growing from $100 billion in June 2017 to $2,158.42 billion on April 5, 2022. Despite the extreme volatility of cryptocurrencies, the use of skewed error innovation distributions in modelling the volatility behaviour of these digital currencies has not been given much research attention. Hence, this study models the volatility of the 5 largest cryptocurrencies by market capitalization (Bitcoin, Ethereum, Tether, Binance coin and USD Coin) using four variants of GARCH models (GJR-GARCH, sGARCH, EGARCH and APARCH) estimated under three skewed error innovation distributions (skewed normal, skewed Student-t and skewed generalized error distributions). Daily closing prices of these currencies were obtained from the Yahoo Finance website. Findings reveal that Binance coin reported higher mean returns than the other digital currencies, while the skewness indicates that Binance coin, Tether and USD Coin increased more than they decreased in value within the period of study. For both Bitcoin and Ethereum, negative skewness was obtained, meaning that within the period of study the returns of these currencies decreased more than they increased in value. Returns from these cryptocurrencies were found to be stationary but not normally distributed, with evidence of the ARCH effect. The skewness parameters in all best forecasting models were significant (p<.05), justifying the use of skewed error innovation distributions with fatter tails than the normal, Student-t and generalized error distributions. For Binance coin, EGARCH-sstd outperformed the other volatility models, while for Bitcoin, Ethereum, Tether and USD Coin, the best forecasting models were EGARCH-sstd, APARCH-sstd, EGARCH-sged and GJR-GARCH-sstd, respectively.
This suggests the superiority of the skewed Student-t and skewed generalized error distributions over the skewed normal distribution. Keywords: skewed generalized error distribution, skewed normal distribution, skewed Student-t distribution, APARCH, EGARCH, sGARCH, GJR-GARCH
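The common core of the model family above is the GARCH(1,1) conditional-variance recursion sigma²_t = omega + alpha·r²_{t-1} + beta·sigma²_{t-1}. The sketch below simulates it with Gaussian innovations for simplicity (the estimated models draw from skewed Student-t or skewed GED instead), with illustrative parameters rather than fitted ones:

```python
import numpy as np

# GARCH(1,1) simulation: returns r_t = sqrt(sigma2_t) * z_t with the variance
# recursion sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
omega, alpha, beta = 0.05, 0.1, 0.85        # alpha + beta < 1 => covariance stationary
rng = np.random.default_rng(42)

n = 2000
r = np.empty(n)
sigma2 = np.empty(n)
sigma2[0] = omega / (1 - alpha - beta)      # unconditional variance = 1.0
r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, n):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

print(r.var())  # fluctuates around the unconditional variance of 1.0
```

Swapping `standard_normal()` for a skewed heavy-tailed draw reproduces the fat tails and asymmetry that motivate the sstd/sged innovation choices in the study; EGARCH, APARCH and GJR-GARCH modify the recursion itself to capture leverage effects.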
Procedia PDF Downloads 118
6603 Cost-Optimized Extra-Lateral Transshipments
Authors: Dilupa Nakandala, Henry Lau
Abstract:
Ever-increasing demand for cost efficiency and customer satisfaction through reliable delivery has been a mandate for logistics practitioners to continually improve inventory management processes. With cost optimization objectives, this study considers an extended scenario in which sourcing from the same echelon of the supply chain, known as lateral transshipment, which is instantaneous but more expensive than purchasing from regular suppliers, is considered by warehouses not only to reactively fulfill urgent outstanding retailer demand that could not be fulfilled from stock on hand but also to proactively reduce backorder cost. Such extra lateral transshipments, as preventive responses, are intended to meet the expected demand during the supplier lead time in a periodic-review ordering policy setting. We develop decision rules to assist logistics practitioners in making a cost-optimized selection between back-ordering and combined reactive and proactive lateral transshipment options. A method for determining the optimal quantity of extra lateral transshipment is developed considering the trade-off between purchasing, holding and backorder cost components. Keywords: lateral transshipment, warehouse inventory management, cost optimization, preventive transshipment
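A minimal sketch of the kind of decision rule described above, comparing total back-order cost with the cost of a lateral transshipment for a given shortage; the cost figures and the fixed-fee parameter are hypothetical, and the paper's full model additionally weighs purchasing and holding costs over the lead time:

```python
# Compare back-ordering a shortage vs. covering it with a lateral transshipment.
def preferred_option(shortage_units, backorder_cost_per_unit, transship_cost_per_unit,
                     transship_fixed_cost=0.0):
    backorder_total = shortage_units * backorder_cost_per_unit
    transship_total = transship_fixed_cost + shortage_units * transship_cost_per_unit
    if transship_total < backorder_total:
        return ("transship", transship_total)
    return ("backorder", backorder_total)

# 40 units short: 40*12 = 480 back-order cost vs. 50 + 40*9 = 410 transshipment cost.
print(preferred_option(40, 12.0, 9.0, transship_fixed_cost=50.0))
```

The preventive variant would apply the same comparison to the *expected* shortage over the supplier lead time rather than to a realized one.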
Procedia PDF Downloads 615
6602 Waterborne Platooning: Cost and Logistic Analysis of Vessel Trains
Authors: Alina P. Colling, Robert G. Hekkenberg
Abstract:
Recent years have seen extensive technological advancement in truck platooning, as reflected in the literature. Its main benefits are the improvement of traffic stability and the reduction of air drag, resulting in lower fuel consumption compared to operating individual trucks. Platooning is now being adapted to the waterborne transport sector in the NOVIMAR project through the development of a Vessel Train (VT) concept. The main focus of VTs, as opposed to truck platoons, is the decrease in manning on board, ultimately working towards autonomous vessel operations. This crew reduction can prove to be an important selling point in achieving economic competitiveness of the waterborne approach when compared to alternative modes of transport. This paper discusses the expected benefits and drawbacks of the VT concept in terms of technical logistic performance and generalized costs. More specifically, VTs can provide flexibility in destination choices for shippers but also add complexity when performing special manoeuvres in VT formation. In order to quantify the costs and performance, a model is developed and simulations are carried out for various case studies. These compare the application of VTs in short sea and inland waterway transport, with specific sailing regimes and on-board technologies allowing different levels of autonomy. The results enable the identification of the most important boundary conditions for the successful operation of the waterborne platooning concept. These findings serve as a framework for future business applications of the VT. Keywords: autonomous vessels, NOVIMAR, vessel trains, waterborne platooning
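A toy generalized-cost comparison illustrating the crew-reduction trade-off at the heart of the VT concept; all figures (crew and fuel cost rates, voyage times, the VT participation fee) are assumptions for illustration, not NOVIMAR results:

```python
# Generalized voyage cost: time-dependent crew and fuel costs, plus an optional
# fee for joining a vessel train. Joining trades a longer, fee-bearing voyage
# for sharply reduced manning cost.
def voyage_cost(hours, crew, crew_cost_per_hour=30.0, fuel_cost_per_hour=200.0,
                vt_fee=0.0):
    return hours * (crew * crew_cost_per_hour + fuel_cost_per_hour) + vt_fee

solo = voyage_cost(hours=20, crew=3)                  # conventional manning
in_vt = voyage_cost(hours=22, crew=1, vt_fee=300.0)   # slower and fee-bearing, 1 crew
print(solo, in_vt)  # the VT option wins under these assumed figures
```

With these numbers the manning saving outweighs the slower sailing regime and the fee; the paper's simulations identify the boundary conditions under which that remains true.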
Procedia PDF Downloads 222
6601 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers
Authors: Pietro D'Ambrosio, Roberta D'Ambrosio
Abstract:
The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and refunctionalization of existing structures. Such interventions involve complexities linked to the need to take into account the urban and social context in which the structure is inserted, as well as its intrinsic characteristics such as historical and artistic value. To these, of course, we must add the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole reuse project. Particular difficulties are encountered during cost pre-assessment, since it is often impossible to perform analytical surveys and structural tests, both because of the structural conditions and because of obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonably accurate economic evaluations of the interventions to be carried out. In addition, the specific features of the approach, derived from predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to yield, as a by-product, a shared database that can be used on a generalized basis to estimate other such projects. This makes the methodology particularly suited to cases where massive intervention across entire areas of historic city centers is expected. The methodology has been partially tested in a study assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated. Keywords: evaluation, methodology, restoration, reuse
Procedia PDF Downloads 187
6600 Estimation of Rare and Clustered Population Mean Using Two Auxiliary Variables in Adaptive Cluster Sampling
Authors: Muhammad Nouman Qureshi, Muhammad Hanif
Abstract:
Adaptive cluster sampling (ACS) was specifically developed for the estimation of highly clumped populations and has been applied to a wide range of situations, such as rare and endangered animal species, unevenly distributed minerals, HIV patients and drug users. In this paper, we propose a generalized semi-exponential estimator with two auxiliary variables under the framework of the ACS design. Expressions for the approximate bias and mean square error (MSE) of the proposed estimator are derived. Theoretical comparisons of the proposed estimator have been made with existing estimators. A numerical study is conducted on real and artificial populations to demonstrate and compare the efficiencies of the proposed estimator. The results indicate that the proposed generalized semi-exponential estimator performs considerably better than all the adaptive and non-adaptive estimators considered in this paper. Keywords: auxiliary information, adaptive cluster sampling, clustered populations, Hansen-Hurwitz estimation
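For context, the classical modified Hansen-Hurwitz estimator that underlies ACS estimation (each initially sampled unit contributes the mean of the network it belongs to) can be sketched on a made-up toy population; the proposed semi-exponential estimator extends this baseline with auxiliary variables:

```python
import numpy as np

# Toy population partitioned into networks (groups of units joined by the
# adaptive search). Network labels and y-values are invented for illustration.
networks = {             # network id -> y-values of the units in that network
    "A": [0, 0, 0],      # empty patch
    "B": [5, 12, 7, 9],  # a cluster of the rare species
    "C": [1],
}
initial_sample = ["A", "B", "B", "C"]  # networks hit by the initial random sample

# Modified Hansen-Hurwitz: average, over the initial sample, of each unit's
# network mean; unbiased for the population mean per unit under ACS.
w = [np.mean(networks[k]) for k in initial_sample]
y_hh = float(np.mean(w))
print(y_hh)  # (0 + 8.25 + 8.25 + 1) / 4 = 4.375
```

Networks sampled twice (here "B") legitimately appear twice in the average, which is what makes the estimator unbiased under with-replacement logic.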
Procedia PDF Downloads 238
6599 Stochastic Matrices and Lp Norms for Ill-Conditioned Linear Systems
Authors: Riadh Zorgati, Thomas Triboulet
Abstract:
In quite diverse application areas such as astronomy, medical imaging, geophysics or nondestructive evaluation, many problems related to calibration, fitting or estimation of a large number of input parameters of a model from a small amount of noisy output data can be cast as inverse problems. Due to noisy data corruption, insufficient data and model errors, most inverse problems are ill-posed in the Hadamard sense, i.e., existence, uniqueness and stability of the solution are not guaranteed. A wide class of inverse problems in physics relates to the Fredholm equation of the first kind. The ill-posedness of such an inverse problem results, after discretization, in a very ill-conditioned linear system of equations; the condition number of the associated matrix can typically range from 10⁹ to 10¹⁸. This condition number acts as an amplifier of data uncertainties during inversion and thus renders the inverse problem difficult to handle numerically. Similar problems appear in other areas such as numerical optimization, where interior point algorithms for solving linear programs lead to ill-conditioned systems of linear equations. Devising efficient solution approaches for such systems of equations is therefore of great practical interest. Efficient iterative algorithms are proposed for solving a system of linear equations. The approach is based on preconditioning the initial matrix of the system with an approximation of a generalized inverse, leading to a stochastic preconditioned matrix. This approach, valid for non-negative matrices, is first extended to Hermitian, positive semi-definite matrices and then generalized to arbitrary complex rectangular matrices. The main results obtained are as follows: 1) We are able to build a generalized inverse of any complex rectangular matrix which satisfies the convergence condition required in iterative algorithms for solving a system of linear equations.
This completes the (short) list of generalized inverses having this property, after the Kaczmarz and Cimmino matrices. Theoretical results on both the characterization of the type of generalized inverse obtained and the convergence are derived. 2) Thanks to its properties, this matrix can be used efficiently in different solution schemes such as Richardson-Tanabe or preconditioned conjugate gradients. 3) Using Lp norms, we propose generalized Kaczmarz-type matrices. We also show how Cimmino's matrix can be considered a particular case, consisting in choosing the Euclidean norm in an asymmetrical structure. 4) Regarding numerical results obtained on some well-known pathological test cases (Hilbert, Nakasaka, ...), some of the proposed algorithms are empirically shown to be more efficient on ill-conditioned problems and more robust to error propagation than the classical techniques we have tested (Gauss, Moore-Penrose inverse, minimum residue, conjugate gradients, Kaczmarz, Cimmino). We end with a very early prospective application of our approach based on stochastic matrices, aiming at computing some parameters of the solution of a linear system (such as its extreme values, mean or variance) prior to its resolution. Such an approach, if it proved efficient, would be a source of information on the solution of a system of linear equations. Keywords: conditioning, generalized inverse, linear system, norms, stochastic matrix
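The classical Kaczmarz scheme that the abstract builds on can be sketched as follows, in its randomized variant that samples rows with probability proportional to their squared norm; the authors' stochastic preconditioning construction is not reproduced here:

```python
import numpy as np

# Randomized Kaczmarz for a consistent system Ax = b: each step projects the
# current iterate onto the hyperplane defined by one sampled row of A.
def kaczmarz(A, b, iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    row_norms2 = (A * A).sum(axis=1)
    probs = row_norms2 / row_norms2.sum()       # sample rows by squared norm
    for _ in range(iters):
        i = rng.choice(A.shape[0], p=probs)
        x += (b[i] - A[i] @ x) / row_norms2[i] * A[i]  # project onto hyperplane i
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
print(kaczmarz(A, b))  # exact solution is [2, 3]
```

Each projection touches a single row, which is what makes Kaczmarz-type methods attractive for the large, very ill-conditioned systems described above; the Cimmino scheme instead averages projections onto all rows simultaneously.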
Procedia PDF Downloads 133