Search results for: empirical validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3775

3415 Estimation of Scour Using a Coupled Computational Fluid Dynamics and Discrete Element Model

Authors: Zeinab Yazdanfar, Dilan Robert, Daniel Lester, S. Setunge

Abstract:

Scour has been identified as the most common threat to bridge stability worldwide. Traditionally, scour around bridge piers is calculated using empirical approaches that have considerable limitations and are difficult to generalize. The multi-physics nature of scouring, which involves turbulent flow, soil mechanics, and solid-fluid interactions, cannot be captured by simple empirical equations developed from limited laboratory data. These limitations can be overcome by direct numerical modeling of the coupled hydro-mechanical scour process, which provides a robust prediction of bridge scour and valuable insights into the scour process. Several numerical models have been proposed in the literature for bridge scour estimation, including Eulerian flow models and coupled Euler-Lagrange models incorporating an empirical sediment transport description. However, the contact forces between particles and the flow-particle interaction have not been taken into consideration. Incorporating collisional and frictional forces between soil particles, as well as the effect of flow-driven forces on particles, facilitates accurate modeling of the complex nature of scour. In this study, a coupled Computational Fluid Dynamics and Discrete Element Model (CFD-DEM) has been developed that simulates the scour process by directly modeling the hydro-mechanical interactions between the sediment particles and the flowing water. This approach obviates the need for an empirical description, as the fundamental fluid-particle and particle-particle interactions are fully resolved. The sediment bed is simulated as a dense pack of particles, and the frictional and collisional forces between particles are calculated, whilst the turbulent fluid flow is modeled using a Reynolds-Averaged Navier-Stokes (RANS) approach. The CFD-DEM model is validated against experimental data in order to assess its reliability. The modeling results reveal the criticality of particle impact in the assessment of scour depth which, to the authors' best knowledge, has not been considered in previous studies. The results of this study open new perspectives on scour depth and time assessment, which is key to managing the failure risk of bridge infrastructure.
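As an illustration of the fluid-particle coupling the abstract describes, the sketch below computes the drag force on a single spherical sediment grain. The Schiller-Naumann drag correlation and all numerical values are assumptions for illustration; the paper does not state which drag law its CFD-DEM model uses.

```python
import math

def drag_force(d_p, rho_f, mu_f, u_rel):
    """Drag force on one spherical particle immersed in fluid, using the
    Schiller-Naumann correlation (valid up to Re ~ 1000).
    d_p: particle diameter [m], rho_f: fluid density [kg/m^3],
    mu_f: dynamic viscosity [Pa*s], u_rel: fluid-particle slip speed [m/s]."""
    re = rho_f * u_rel * d_p / mu_f          # particle Reynolds number
    if re < 1e-12:
        return 0.0
    cd = 24.0 / re * (1.0 + 0.15 * re**0.687) if re < 1000 else 0.44
    area = math.pi * d_p**2 / 4.0            # projected area of the sphere
    return 0.5 * cd * rho_f * u_rel**2 * area

# Hypothetical case: a 1 mm sand grain in water, 0.5 m/s slip velocity
f = drag_force(1e-3, 1000.0, 1e-3, 0.5)
```

In a full CFD-DEM solver this force would be evaluated per particle per time step and fed back into both the DEM momentum update and the RANS momentum sink.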

Keywords: bridge scour, discrete element method, CFD-DEM model, multi-phase model

Procedia PDF Downloads 109
3414 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia

Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman

Abstract:

Through progress in pavement design, a method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG) was developed. Saudi Arabia's road and highway network has been expanding as a result of increasing traffic volume, and the MEPDG is currently implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementing the MEPDG for local pavement design requires calibrating its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for calibration of the MEPDG in central Saudi Arabia. The first goal is therefore to collect flexible pavement design data for the local conditions of the Riyadh region. Since the collected data must be transformed into model inputs, the main goal of this paper is the analysis of the collected data. The analysis covers each of the following: truck classification, traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axle types (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are given, providing an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
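Two of the traffic inputs listed above have simple standard definitions that can be sketched directly; the monthly truck counts below are hypothetical, not the Riyadh data.

```python
# Hypothetical average daily truck traffic per month (12 values)
monthly_adtt = [820, 790, 850, 900, 940, 980, 1010, 1000, 950, 910, 870, 830]

# Annual Average Daily Truck Traffic: mean of the monthly daily averages
aadtt = sum(monthly_adtt) / 12.0

# Monthly Adjustment Factor for month i: MAF_i = 12 * ADTT_i / sum(ADTT),
# i.e. each month's share of annual truck traffic scaled so the 12 factors
# sum to 12 (a month with average traffic gets MAF = 1.0)
maf = [12.0 * m / sum(monthly_adtt) for m in monthly_adtt]
```

The same per-class tabulation pattern extends to VCD and hourly distribution factors, which are likewise shares of a classified count total.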

Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh

Procedia PDF Downloads 208
3413 Analyzing the Effects of Real Income and Biomass Energy Consumption on Carbon Dioxide (CO2) Emissions: Empirical Evidence from the Panel of Biomass-Consuming Countries

Authors: Eyup Dogan

Abstract:

This empirical study analyzes the impacts of real income and biomass energy consumption on emission levels in the EKC model for a panel of biomass-consuming countries over the period 1980-2011. Because we detect the presence of cross-sectional dependence and heterogeneity across countries in the analyzed data, we use panel estimation methods robust to both. The CADF and CIPS panel unit root tests indicate that carbon emissions, real income, and biomass energy consumption are stationary at first differences. The LM bootstrap panel cointegration test shows that the analyzed variables are cointegrated. Results from the panel group-mean DOLS and group-mean FMOLS estimators show that an increase in biomass energy consumption decreases CO2 emissions and that the EKC hypothesis is validated. Countries are therefore advised to boost production and increase the use of biomass energy to achieve lower emission levels.
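The EKC hypothesis the abstract validates rests on a quadratic income term, and its inverted-U turning point has a closed form. The coefficients below are hypothetical, not the paper's estimates.

```python
# EKC specification: CO2 = a + b1*income + b2*income^2 + b3*biomass + e
# The inverted-U (EKC) shape requires b1 > 0 and b2 < 0; emissions peak
# where d(CO2)/d(income) = b1 + 2*b2*income = 0, i.e. income = -b1/(2*b2).
b1, b2 = 0.8, -0.02          # hypothetical estimated coefficients

turning_point = -b1 / (2.0 * b2)   # income level at peak emissions
ekc_holds = b1 > 0 and b2 < 0
```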

Keywords: biomass energy, CO2 emissions, EKC model, heterogeneity, cross-sectional dependence

Procedia PDF Downloads 273
3412 Study of Clutch Cable Architecture and Its Influence in Efficiency of Mechanical Cable Release System

Authors: M. Devamanalan, K. Pothiraj, M. Sudhan

Abstract:

In a competitive market like India, any system is expected to deliver equally on performance and durability. A vehicle comprises multiple sub-systems such as the powertrain, BIW, brakes, actuation, suspension, and seats. To withstand market challenges, the contribution of each sub-system is vital: the malfunction of any one sub-system directly impacts the performance of the overall system and leads to dissatisfaction for the end user. The powertrain consists of several sub-systems, of which the clutch is one of the prime sub-systems in manual-transmission (MT) vehicles, enabling smoother gear shifts through proper clutch disengagement and engagement. In general, most vehicles have a mechanical, semi-hydraulic, or fully hydraulic clutch release system, whereas small commercial vehicles (SCVs) mostly use a mechanical cable release system because of its lower cost and functional requirements. The major bottlenecks in the cable-type clutch release system are increased pedal effort due to rising hysteresis, and hard gear shifting due to efficiency loss and cable slackness as the vehicle accumulates mileage. This study focuses on how efficiency and hysteresis change over vehicle mileage as a consequence of the design architecture of the outer and inner cables. The study involves several cable design validation results at vehicle level and rig level, using defined cable routings and test procedures. Results are compared to select the cable design architecture with better efficiency and lower hysteresis at the start and end of validation.
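The two parameters the study compares can be sketched as simple ratios; the force values below are hypothetical rig measurements, not the paper's data.

```python
def cable_efficiency(force_out, force_in):
    """Release-system efficiency: load delivered at the clutch-fork end of
    the cable divided by the effort applied at the pedal end."""
    return force_out / force_in

# Hypothetical rig measurements [N] at start and end of durability cycling
initial = cable_efficiency(188.0, 200.0)   # new cable
aged = cable_efficiency(160.0, 200.0)      # after mileage accumulation

# Hysteresis: pedal-effort gap between the apply and release strokes at the
# same pedal position (hypothetical efforts of 215 N apply, 195 N release)
hysteresis = 215.0 - 195.0
```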

Keywords: clutch, clutch cable, efficiency, architecture, cable routing

Procedia PDF Downloads 93
3411 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluating goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive CV variant, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to exact LOO-CV; they reuse the existing MCMC results, avoiding expensive recomputation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly; in TIS-LOO and PSIS-LOO, the larger weights are replaced by modified truncated weights. Although information criteria and LOO-CV cannot reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest. However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of exact LOO-CV, the study observed some drastic deviations in their results. However, there are interesting relationships among the logarithms of pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for the models conditional on equal posterior variances in lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, together with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
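The raw and truncated importance weights described above can be sketched for a single observation as follows; the per-draw predictive densities are hypothetical, and the truncation rule shown (cap at sqrt(S) times the mean weight) is the standard TIS choice, which may differ in detail from the study's implementation.

```python
import math

# Hypothetical predictive densities p(y_i | theta^s) for one observation,
# evaluated over S = 5 posterior draws
dens = [0.40, 0.35, 0.05, 0.50, 0.45]
S = len(dens)

# Raw importance weights: reciprocals of the predictive densities
raw_w = [1.0 / d for d in dens]

# IS-LOO estimate of p(y_i | y_{-i}): weighted average of the per-draw
# densities, which algebraically reduces to S / sum(raw weights)
is_loo = sum(w * d for w, d in zip(raw_w, dens)) / sum(raw_w)

# TIS-LOO: cap each weight at sqrt(S) * mean(raw weight) before averaging,
# taming the unbounded variance that a few extreme raw weights can cause
cap = math.sqrt(S) * sum(raw_w) / S
tis_w = [min(w, cap) for w in raw_w]
tis_loo = sum(w * d for w, d in zip(tis_w, dens)) / sum(tis_w)
```

PSIS-LOO replaces this hard cap with weights smoothed by a fitted generalized Pareto tail, which also yields a diagnostic (the Pareto shape parameter) for when the approximation is unreliable.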

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 371
3410 An Empirical Study on Growth, Trade, Foreign Direct Investment and Environment in India

Authors: Shilpi Tripathi

Abstract:

India adopted a policy of economic reforms (globalization, liberalization, and privatization) in 1991, which reduced trade barriers and investment restrictions and thereby increased the economy's international trade, foreign direct investment (FDI) inflows, and Gross Domestic Product (GDP) growth. This paper empirically studies the relationship between India's international trade, GDP, FDI, and environment during 1978-2012. The first part of the paper focuses on the background and trends of FDI, GDP, trade, and environment (CO2). The second part reviews the literature on the relationships among all the variables. In the last part, we examine the results of the empirical analysis, namely cointegration and Granger causality tests among foreign trade, FDI inflows, GDP, and CO2 since 1978. The findings reveal that only one unidirectional causality exists, between trade and GDP; the direction of causality indicates that international trade is one of the major contributors to economic growth (GDP). No causality was found between GDP and FDI, between FDI and CO2, or between international trade and CO2. The paper concludes with policy recommendations to ensure environmentally friendly trade, investment, and growth in India in the future.
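The Granger causality test underlying these findings compares a restricted regression (own lags only) against an unrestricted one (own lags plus lags of the candidate cause) via an F-statistic. The sketch below shows that statistic; the residual sums of squares and sample size are hypothetical, not the paper's results.

```python
def granger_f(ssr_restricted, ssr_unrestricted, n_lags, n_obs, n_params):
    """F-statistic for the Granger causality test: does adding n_lags lagged
    terms of the candidate cause significantly reduce the residual sum of
    squares? n_params is the parameter count of the unrestricted model."""
    num = (ssr_restricted - ssr_unrestricted) / n_lags
    den = ssr_unrestricted / (n_obs - n_params)
    return num / den

# Hypothetical SSRs from a GDP equation with and without lagged trade
# terms, over n = 35 annual observations and 2 lags
f_stat = granger_f(ssr_restricted=48.0, ssr_unrestricted=30.0,
                   n_lags=2, n_obs=35, n_params=5)
```

A large F relative to the F(n_lags, n_obs - n_params) critical value rejects the null that trade does not Granger-cause GDP.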

Keywords: international trade, foreign direct investment, GDP, CO2, co-integration, granger causality test

Procedia PDF Downloads 419
3409 Systematic Review of Quantitative Risk Assessment Tools and Their Effect on Racial Disproportionality in Child Welfare Systems

Authors: Bronwen Wade

Abstract:

Over the last half-century, child welfare systems have increasingly relied on quantitative risk assessment tools, such as actuarial or predictive risk tools. These tools are developed by statistically analyzing how attributes captured in administrative data relate to future child maltreatment. Some scholars argue that attributes in administrative data can serve as proxies for race and that quantitative risk assessment tools reify racial bias in decision-making. Others argue that these tools provide more “objective” and “scientific” guides for decision-making than subjective social worker judgment. This study performs a systematic review of the literature on the impact of quantitative risk assessment tools on racial disproportionality; it examines methodological biases in work on this topic, summarizes key findings, and provides suggestions for further work. A search of CINAHL, PsycINFO, the ProQuest Social Science Premium Collection, and the ProQuest Dissertations and Theses Collection was performed. Academic and grey literature were included. The review includes studies that use quasi-experimental methods and development, validation, or re-validation studies of quantitative risk assessment tools. PROBAST (Prediction model Risk of Bias Assessment Tool) and CHARMS (CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies) were used to assess the risk of bias and guide data extraction for risk development, validation, or re-validation studies. ROBINS-I (Risk of Bias in Non-Randomized Studies of Interventions) was used to assess bias and guide data extraction for the quasi-experimental studies identified. Due to heterogeneity among papers, a meta-analysis was not feasible, and a narrative synthesis was conducted. Eleven papers met the eligibility criteria, and each has an overall high risk of bias based on the PROBAST and ROBINS-I assessments.
This is deeply concerning, as major policy decisions have been made based on a limited number of studies with a high risk of bias. The findings on racial disproportionality have been mixed and depend on the tool and approach used. Authors use various definitions for racial equity, fairness, or disproportionality. These concepts of statistical fairness are connected to theories about the reason for racial disproportionality in child welfare or social definitions of fairness that are usually not stated explicitly. Most findings from these studies are unreliable, given the high degree of bias. However, some of the less biased measures within studies suggest that quantitative risk assessment tools may worsen racial disproportionality, depending on how disproportionality is mathematically defined. Authors vary widely in their approach to defining and addressing racial disproportionality within studies, making it difficult to generalize findings or approaches across studies. This review demonstrates the power of authors to shape policy or discourse around racial justice based on their choice of statistical methods; it also demonstrates the need for improved rigor and transparency in studies of quantitative risk assessment tools. Finally, this review raises concerns about the impact that these tools have on child welfare systems and racial disproportionality.

Keywords: actuarial risk, child welfare, predictive risk, racial disproportionality

Procedia PDF Downloads 29
3408 Exploring the Effect of Accounting Information on Systematic Risk: An Empirical Evidence of Tehran Stock Exchange

Authors: Mojtaba Rezaei, Elham Heydari

Abstract:

This paper presents the empirical results of analyzing the correlation between accounting information and systematic risk. The association between financial ratios and systematic risk is analyzed using the financial statements of 39 companies listed on the Tehran Stock Exchange (TSE) over five years (2014-2018). Financial ratios were categorized into four groups, and as representatives of accounting information we selected: return on assets (ROA), debt ratio (total debt to total assets), current ratio (current assets to current liabilities), asset turnover (net sales to total assets), and total assets. The hypotheses were tested through simple and multiple linear regression and Student's t-tests. The findings illustrate that there is no significant relationship between accounting information and market risk, indicating that in the selected sample, historical accounting information does not fully reflect stock prices.
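The simple linear regression and t-test used here can be sketched in a few lines; the ROA and beta values below are synthetic illustrations, not TSE data.

```python
import math

def ols_slope_t(x, y):
    """Slope and t-statistic for the slope in y = a + b*x + e (simple OLS)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))
    se_b = math.sqrt(sse / (n - 2) / sxx)   # standard error of the slope
    return b, b / se_b

# Synthetic example: ROA (%) against market beta for six firms
roa = [4.0, 6.5, 8.0, 3.0, 10.0, 7.5]
beta = [1.2, 1.0, 0.9, 1.3, 0.8, 1.1]
slope, t_stat = ols_slope_t(roa, beta)
```

The hypothesis test then compares |t_stat| against the Student's t critical value with n - 2 degrees of freedom; an insignificant t, as the paper reports, means the ratio carries no detectable information about systematic risk.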

Keywords: accounting information, market risk, systematic risk, stock return, efficient market hypothesis, EMH, Tehran stock exchange, TSE

Procedia PDF Downloads 111
3407 Conditions Required for New Sector Emergence: Results from a Systematic Literature Review

Authors: Laurie Prange-Martin, Romeo Turcan, Norman Fraser

Abstract:

The aim of this study is to identify the conditions required for, and describe the process of, emergence of a new economic sector created from new or established businesses. A systematic literature review of English-language studies published from 1983 to 2016 was conducted using the following databases: ABI/INFORM Complete, Business Source Premier, Google Scholar, Scopus, and Web of Science. The two main terms, business sector and emergence, were used in the systematic literature search, along with seventeen synonyms for each. From the search results, 65 publications met the requirement of being an empirical study discussing and reporting the conditions of new sector emergence. A meta-analysis of the literature examined suggests that there are six favourable conditions and five key individuals or groups required for new sector emergence. In addition, the meta-analysis showed that eighteen theories are used in the literature to explain the phenomenon of new sector emergence, which can be grouped into three disciplines. Given such diversity in the theoretical frameworks used across the 65 empirical studies, the authors propose the development of a new theory of sector emergence.

Keywords: economic geography, new sector emergence, economic diversification, regional economies

Procedia PDF Downloads 252
3406 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed from causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how predictive analytics can support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where the items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by its numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge numbers of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training records are fed into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, variables associated with sociodemographics, habits, and activities. Some, such as cancer examination, house type, and vaccination, are intentionally included to gain predictive-analytics insights on variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation; a set of rules (many estimated equations from a statistical perspective), by contrast, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of the observations, such as bootstrap resampling with an appropriate sample size.
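The co-occurrence counting and the "surprise" metric (observed vs. expected frequency) described above can be sketched as follows. The rows, item names, and the lift-style definition of surprise are illustrative assumptions; the actual platform's metrics may differ.

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions: each row is the set of co-occurring items
rows = [
    {"obesity", "hypertension", "low_activity"},
    {"obesity", "hypertension"},
    {"obesity", "low_activity"},
    {"hypertension", "smoking"},
    {"obesity", "hypertension", "smoking"},
]
n = len(rows)

# Node sizes: how often each item appears across rows
item_freq = Counter(i for row in rows for i in row)

# Arc weights: how often each unordered pair co-occurs in a row
pair_freq = Counter(frozenset(p) for row in rows
                    for p in combinations(sorted(row), 2))

def surprise(a, b):
    """Observed co-occurrence rate vs. the rate expected if a and b
    occurred independently (a lift-style metric)."""
    observed = pair_freq[frozenset((a, b))] / n
    expected = (item_freq[a] / n) * (item_freq[b] / n)
    return observed / expected

lift = surprise("obesity", "hypertension")
```

Note that the pair counter's size depends on the number of distinct items, not the number of transactions, which is the scaling property the abstract highlights.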

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 248
3405 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill-in the Gaps in Their Analysis

Authors: John Gaber

Abstract:

Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice for planners is their use of data from local experts and stakeholders (known as "etic" data or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at who planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the “community’s view.” The paper concludes with how planners can chart out a new empirical path in their execution of emic/etic citizen participation strategies in their applied planning research projects.

Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)

Procedia PDF Downloads 469
3404 5G Future Hyper-Dense Networks: An Empirical Study and Standardization Challenges

Authors: W. Hashim, H. Burok, N. Ghazaly, H. Ahmad Nasir, N. Mohamad Anas, A. F. Ismail, K. L. Yau

Abstract:

Future communication networks require devices that work on a single platform yet support heterogeneous operations, leading to service diversity and functional flexibility. This paper proposes two cognitive mechanisms, termed cognitive hybrid functions, applied in multiple broadband user terminals in order to maintain reliable connectivity and prevent unnecessary interference. By employing such mechanisms, especially in future hyper-dense networks, we can observe their performance in terms of optimized speed and power-saving efficiency. Results were obtained from several empirical laboratory studies. It was found that reliable network selection improved optimized speed by up to 37% compared with operation without such a function. In terms of power adjustment, the mechanism can reduce transmit power by 5 dB while maintaining the same level of throughput otherwise achieved at higher power. We also discuss the issues impacting future telecommunication standards as such devices are deployed.

Keywords: dense network, intelligent network selection, multiple networks, transmit power adjustment

Procedia PDF Downloads 353
3403 Price to Earnings Growth (PEG) Predicting Future Returns Better than the Price to Earnings (PE) Ratio

Authors: Lindrianasari Stefanie, Aminah Khairudin

Abstract:

This study aims to provide empirical evidence regarding the ability of the price-to-earnings (PE) ratio and the PEG ratio to predict issuers' future stock returns. The sample comprises the stocks included in the LQ45 index. The main contribution is empirical evidence on whether the PEG ratio provides better returns than the PE ratio. The data are limited to the financial statements of companies in the LQ45 during July 2013-July 2014, using the financial statements and each company's closing stock price at the end of 2010 as a reference benchmark for the growth of the company's stock price relative to the closing price of 2013. The study found that the PEG ratio can outperform the PE ratio in predicting future returns on the LQ45 stock portfolio.
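The two ratios being compared have standard definitions that can be sketched directly; the price, EPS, and growth figures below are hypothetical, not LQ45 data.

```python
def pe_ratio(price, eps):
    """Price-to-earnings: share price over earnings per share."""
    return price / eps

def peg_ratio(price, eps, growth_pct):
    """PEG: the PE ratio divided by expected annual EPS growth in percent,
    so a stock priced in line with its growth has PEG near 1."""
    return pe_ratio(price, eps) / growth_pct

# Hypothetical issuer: price 3000, EPS 200, 20% expected earnings growth
pe = pe_ratio(3000.0, 200.0)
peg = peg_ratio(3000.0, 200.0, 20.0)
```

The intuition behind the study's finding is that PEG adjusts a raw PE for growth: a high-PE stock may still be cheap (PEG < 1) if its earnings growth is high enough.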

Keywords: price to earnings growth, price to earnings ratio, future returns, stock price

Procedia PDF Downloads 393
3402 Understanding Personal Well-Being among Entrepreneurial Breadwinners: Bibliographic and Empirical Analyses of Relative Resource Theory

Authors: E. Fredrick Rice

Abstract:

Over the past three decades, a substantial body of academic literature has asserted that the pressure to maintain household income can negatively affect the personal well-being of breadwinners. Given that scholars have failed to thoroughly explore this phenomenon with breadwinners who are also business owners, theory has been underdeveloped in the entrepreneurial context. To identify the most appropriate theories to apply to entrepreneurs, the current paper utilized two approaches. First, a comprehensive bibliographic analysis was conducted focusing on works at the intersection of breadwinner status and well-being. Co-authorship and journal citation patterns highlighted relative resource theory as a boundary spanning approach with promising applications in the entrepreneurial space. To build upon this theory, regression analysis was performed using data from the Panel Study of Entrepreneurial Dynamics (PSED). Empirical results showed evidence for the effects of breadwinner status and household income on entrepreneurial well-being. Further, the findings suggest that it is not merely income or job status that predicts well-being, but one’s relative financial contribution compared to that of one’s non-breadwinning organizationally employed partner. This paper offers insight into how breadwinner status can be studied in relation to the entrepreneurial personality.

Keywords: breadwinner, entrepreneurship, household income, well-being

Procedia PDF Downloads 148
3401 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia

Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih

Abstract:

The Human Development Index (HDI) is a standard measurement of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and the influencing factors show that the data do not follow a specific pattern or form; therefore, the HDI data for Indonesia can be modeled with nonparametric regression. The estimated regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline, a modification of segmented polynomial functions. The estimator of a truncated spline regression model depends on the selection of optimal knot points, the joining points of the truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross-validation (GCV). In this article, a truncated spline nonparametric regression model is applied to HDI data for Indonesia. The best truncated spline regression model for these data was obtained with the optimal knot-point combination 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the Indonesian HDI data well.
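A truncated spline is a polynomial plus one truncated power term per knot, so the fitted curve can bend at each knot. The sketch below evaluates such a function; the coefficients and knot locations are hypothetical, not the paper's estimates.

```python
def truncated_power(x, knot, degree=1):
    """Truncated power basis term: (x - knot)_+^degree, zero below the knot."""
    return max(x - knot, 0.0) ** degree

def spline_eval(x, coeffs, knots, degree=1):
    """Evaluate a truncated spline: a degree-d polynomial part plus one
    truncated power term per knot.
    coeffs = [b0, b1, ..., b_degree, c1, ..., cK] for K knots."""
    poly = sum(coeffs[j] * x ** j for j in range(degree + 1))
    trunc = sum(c * truncated_power(x, k, degree)
                for c, k in zip(coeffs[degree + 1:], knots))
    return poly + trunc

# Hypothetical linear spline in life expectancy with knots at 60 and 70:
# slope 0.5 below 60, 0.8 between the knots, 0.6 above 70
y = spline_eval(75.0, coeffs=[10.0, 0.5, 0.3, -0.2], knots=[60.0, 70.0])
```

Candidate knot combinations are then scored by GCV, essentially a residual sum of squares penalized by the model's effective degrees of freedom, and the minimizing combination (here 5-5-5-4 across the four predictors) is selected.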

Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline

Procedia PDF Downloads 311
3400 Numerical Modeling of Geogrid Reinforced Soil Bed under Strip Footings Using Finite Element Analysis

Authors: Ahmed M. Gamal, Adel M. Belal, S. A. Elsoud

Abstract:

This article studies the effect of reinforcement inclusions (geogrids) on the bearing capacity of sand dunes under strip footings. An experimental physical model was carried out to study the effects of the depth of the first geogrid reinforcement layer (u/B), the spacing between reinforcement layers (h/B), and the reinforcement extension relative to the footing length (L/B) on the mobilized bearing capacity. This paper presents numerical modeling using the commercial finite element package PLAXIS (version 8.2) to simulate the laboratory physical model, studying the same parameters handled in the experimental work (u/B, L/B, and h/B) for the purpose of validation. The soil, the geogrid, the interface elements, and the boundary conditions are discussed alongside a set of finite element results and their validation. The validated FE model was then used to study real materials and dimensions of strip foundations. Based on the experimental and numerical investigation results, a significant increase in the bearing capacity of footings occurred when the inclusions were appropriately located in the sand. The optimum embedment depth of the first reinforcement layer is u/B = 0.25, the optimum spacing between successive reinforcement layers is h/B = 0.75, the optimum length of the reinforcement layer is L/B = 7.5, and the optimum number of reinforcement layers is 4. The study showed a directly proportional relation between the number of reinforcement layers and the bearing capacity ratio (BCR), and an inversely proportional relation between the footing width and the BCR.
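The bearing capacity ratio used to report these gains is a simple quotient; the pressures below are hypothetical, not the study's measurements.

```python
def bearing_capacity_ratio(q_reinforced, q_unreinforced):
    """BCR: ultimate bearing pressure of the reinforced sand bed divided by
    that of the unreinforced bed, compared at the same settlement."""
    return q_reinforced / q_unreinforced

# Hypothetical ultimate pressures [kPa] for a strip footing on dune sand
bcr = bearing_capacity_ratio(q_reinforced=310.0, q_unreinforced=155.0)
```

BCR > 1 indicates the geogrid layout improves capacity; the study's parametric sweep varies u/B, h/B, L/B, and layer count to maximize this ratio.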

Keywords: reinforced soil, geogrid, sand dunes, bearing capacity

Procedia PDF Downloads 392
3399 Optimized Text Summarization Model on Mobile Screens for Sight-Interpreters: An Empirical Study

Authors: Jianhua Wang

Abstract:

To obtain key information quickly from long texts on the small screens of mobile devices, sight-interpreters need an optimized summarization model for fast information retrieval. Four summarization models based on previous studies were examined: title + key words (TKW), title + topic sentences (TTS), key words + topic sentences (KWTS), and title + key words + topic sentences (TKWTS). Psychological experiments were conducted on the four models for three different genres of interpreting texts to establish the optimized summarization model for sight-interpreters. This empirical study shows that the optimized summarization model for sight-interpreters to quickly grasp the key information of the texts they interpret is title + key words (TKW) for cultural texts, title + key words + topic sentences (TKWTS) for economic texts, and key words + topic sentences (KWTS) for political texts.

Keywords: different genres, mobile screens, optimized summarization models, sight-interpreters

Procedia PDF Downloads 290
3398 The Mediating Effect of Individual Readiness for Change in the Relationship between Organisational Culture and Individual Commitment to Change

Authors: Mohamed Haffar, Lois Farquharson, Gbola Gbadamosi, Wafi Al-Karaghouli, Ramadane Djbarni

Abstract:

A few recent research studies, mostly conceptual in nature, have paid attention to the relationship between organizational culture (OC), individual readiness for change (IRFC), and individual affective commitment to change (IACC). Surprisingly, there is a lack of empirical studies investigating the influence of all four OC types on IRFC and IACC. Moreover, very limited research has investigated the mediating role of individual readiness for change between OC types and individual affective commitment to change. This study therefore fills this gap by providing empirical evidence that advances the understanding of the direct and indirect influences of OC on individual affective commitment to change. To achieve this, a questionnaire-based survey was developed and self-administered to 226 middle managers in Algerian manufacturing organizations (AMOs). The results indicate that group culture and adhocracy culture positively affect IACC. Furthermore, the findings support the mediating roles of self-efficacy and personal valence in the relationship between OC and IACC.
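A mediation claim of the kind tested above (OC → readiness → commitment) is commonly checked with a bootstrap of the indirect effect a·b. The sketch below does this on synthetic standardized scores; the path coefficients and error scales are assumptions for illustration, not the study's estimates, though the sample size matches its 226 respondents.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 226  # matches the study's survey size
# Synthetic standardized scores: culture -> readiness -> commitment
oc = rng.normal(size=n)                             # organizational culture
irfc = 0.5 * oc + rng.normal(scale=0.8, size=n)     # readiness (mediator)
iacc = 0.4 * irfc + 0.2 * oc + rng.normal(scale=0.8, size=n)

def slope(x, y):
    """OLS slope of y on x (path a)."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def path_b(oc, irfc, iacc):
    """Effect of the mediator on the outcome, controlling for OC."""
    X = np.column_stack([np.ones_like(oc), oc, irfc])
    beta = np.linalg.lstsq(X, iacc, rcond=None)[0]
    return beta[2]

# Percentile-bootstrap confidence interval for the indirect effect a*b
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    a = slope(oc[idx], irfc[idx])
    b = path_b(oc[idx], irfc[idx], iacc[idx])
    boot.append(a * b)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(lo > 0)  # a CI excluding zero supports mediation
```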

Keywords: individual readiness for change, individual commitment to change, organisational culture, manufacturing organisations

Procedia PDF Downloads 488
3397 Cultural Heritage, War and Heritage Legislations: An Empirical Review

Authors: Gebrekiros Welegebriel Asfaw

Abstract:

The conservation of cultural heritage during times of war is a topic of significant importance and concern in the field of heritage studies. The destruction, looting, and illicit acts against cultural heritage have devastating consequences. International and national legislations have been put in place to address these issues and provide a legal framework for protecting cultural heritage during armed conflicts. The aim of this review is to examine the existing heritage legislations and evaluate their effectiveness in protecting cultural heritage during times of war, with particular attention to the Tigray war. The review is based on a comprehensive empirical analysis of existing heritage legislations related to the protection of cultural heritage during war. It reveals that several international and national legislations are in place to protect cultural heritage during times of war; however, their implementation has been insufficient and ineffective in the case of the Tigray war. The priceless cultural heritage of Tigray, once a center of investment and world pride, has been subjected to destruction, looting, and other illicit acts, in violation of both international conventions such as the UNESCO Convention and national legislations. Therefore, there is a need for consistent intervention and enforcement of legislation by the international community and organizations to rehabilitate, repatriate, and reinstitute the irreplaceable heritage of Tigray.

Keywords: cultural heritage, heritage legislations, Tigray, war

Procedia PDF Downloads 119
3396 Assessment of Artists’ Socioeconomic and Working Conditions: The Empirical Case of Lithuania

Authors: Rusne Kregzdaite, Erika Godlevska, Morta Vidunaite

Abstract:

The main aim of this research is to explore existing methodologies for studying the artistic labour force and to develop a model for assessing artists' socio-economic and creative conditions. Artists have dual aims in their creative working process: 1) income and 2) artistic self-expression. The valuation of their conditions takes both sides into consideration: the factors related to income and the satisfaction derived from the creative process and its result. The problem addressed in the study is which tangible and intangible criteria should be used to assess artists' creative conditions. The proposed model includes objective factors (working time, income, etc.) and subjective factors (salary covering essential needs, self-satisfaction). Other intangible indicators are also taken into account: the impact on the common culture, social values, and the possibility to receive awards or to represent the country in the international market. The empirical model consists of 59 separate indicators grouped into eight categories. The deviation of each indicator from the general evaluation allows the strongest and weakest components of artists' conditions to be identified.
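The deviation-based reading of the model can be sketched numerically: aggregate normalized indicator scores by category and compare each category mean to the overall mean. The category sizes and random scores below are illustrative assumptions; the paper specifies only the totals (59 indicators, eight categories).

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical normalized indicator scores in [0, 1]; sizes are an
# illustrative split of the paper's 59 indicators into 8 categories.
sizes = [8, 8, 8, 7, 7, 7, 7, 7]
scores = [rng.uniform(0, 1, s) for s in sizes]

category_means = np.array([s.mean() for s in scores])
overall = np.concatenate(scores).mean()
deviation = category_means - overall   # > 0: strength, < 0: weakness

strongest = int(np.argmax(deviation))  # strongest component of conditions
weakest = int(np.argmin(deviation))    # weakest component
print(strongest, weakest)
```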

Keywords: artist conditions, artistic labour force, cultural policy, indicator, assessment model

Procedia PDF Downloads 125
3395 Aggregate Fluctuations and the Global Network of Input-Output Linkages

Authors: Alexander Hempfing

Abstract:

The desire to understand business cycle fluctuations, trade interdependencies, and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from empirical as well as theoretical perspectives. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow identifying structurally important nodes on a global, national, and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others, serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
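Eigenvector centrality, the hub metric used above, is the dominant eigenvector of the (weighted) adjacency matrix and can be computed by power iteration. The 4-sector flow matrix below is a toy assumption, not data from the World Input-Output Database.

```python
import numpy as np

# Toy inter-sector flow matrix Z: entry Z[i, j] is the value of inputs
# sector i supplies to sector j (symmetric here for simplicity).
Z = np.array([
    [0.0, 1.0, 3.0, 1.0],
    [1.0, 0.0, 3.0, 1.0],
    [3.0, 3.0, 0.0, 3.0],
    [1.0, 1.0, 3.0, 0.0],
])

def eigenvector_centrality(A, iters=200):
    """Power iteration: centrality is the dominant eigenvector of A,
    normalized to sum to one."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    return v / v.sum()

c = eigenvector_centrality(Z)
hub = int(np.argmax(c))  # structurally most important "sector"
print(hub)
```

Sector 2, with the heaviest links to every other sector, emerges as the hub; on the real data, ranking these indices across all national sectors yields the heavy-tailed distribution the paper reports.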

Keywords: economic integration, industrial organization, input-output economics, network economics, production networks

Procedia PDF Downloads 246
3394 Indoor Visible Light Communication Channel Characterization for User Mobility: A Use-Case Study

Authors: Pooja Sanathkumar, Srinidhi Murali, Sethuraman TV, Saravanan M, Paventhan Arumugam, Ashwin Ashok

Abstract:

The last decade has witnessed significant interest in visible light communication (VLC) technology, as VLC can potentially achieve high data rate links and secure communication channels. However, the use of VLC under mobile settings is fundamentally limited, as it is a line-of-sight (LOS) technology, and there have been limited breakthroughs in realizing VLC for mobile settings. In this regard, this work studies the VLC channel under mobility. Through a use-case analysis with experimental data traces, this paper presents an empirical VLC channel study considering the application of VLC for smart lighting in an indoor room environment. The paper contributes a calibration study of a prototype VLC smart lighting system in an indoor environment. Based on the inferences gained from the calibration, and considering a user carrying a mobile device fitted with a VLC receiver, this work presents recommendations for the user's position adjustments, with the goal of ensuring maximum connectivity across the room.
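The position sensitivity the study calibrates empirically follows from the standard Lambertian LOS channel model, in which the DC gain falls off with distance and with the irradiance and incidence angles. The sketch below evaluates that textbook model; the half-power angle, detector area, and field of view are assumed values, not the prototype's parameters.

```python
import math

def lambertian_gain(d, phi, psi, half_angle_deg=60.0, area=1e-4,
                    fov_deg=70.0, ts=1.0, g=1.0):
    """DC channel gain of a LOS VLC link with a Lambertian emitter.
    d: distance (m); phi: irradiance angle (rad); psi: incidence angle
    (rad); ts, g: optical filter and concentrator gains."""
    if psi > math.radians(fov_deg):
        return 0.0  # receiver outside its field of view: link is lost
    m = -math.log(2) / math.log(math.cos(math.radians(half_angle_deg)))
    return ((m + 1) * area / (2 * math.pi * d * d)
            * math.cos(phi) ** m * ts * g * math.cos(psi))

# Gain directly under the luminaire vs. off to the side of the room
h_center = lambertian_gain(d=2.0, phi=0.0, psi=0.0)
h_offset = lambertian_gain(d=2.5, phi=math.radians(40), psi=math.radians(40))
print(h_center > h_offset)  # moving off-axis weakens the channel
```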

Keywords: visible light communication, mobility, empirical study, channel characterization

Procedia PDF Downloads 112
3393 Developing a Model of Teaching Writing Based On Reading Approach through Reflection Strategy for EFL Students of STKIP YPUP

Authors: Eny Syatriana, Ardiansyah

Abstract:

The purpose of this study was to develop a learning model for writing based on reading texts using a reflection strategy. The strategy allows the students to read a text, write back its main idea, and then develop the text using their own sentences. Thus, writing practice begins with reading an interesting text, which the students then develop into their own writing. The research questions are: (1) What kind of learning model can develop the students' writing ability? (2) What is the achievement of the students of STKIP YPUP through the reflection strategy? (3) Is the use of the strategy effective in developing students' competence in writing? (4) To what level are the students interested in the use of the strategy in the writing subject? This development research consisted of the following steps: (1) need analysis, (2) model design, (3) implementation, and (4) model evaluation. The need analysis was carried out through discussion among the writing lecturers to create a learning model for the writing subject. To assess the effectiveness of the model, an experiment was delivered to one class. The instruments and learning materials were validated by experts: the research followed the principles and procedures of design and development, in which the researcher conducted a need analysis, created a prototype, carried out content validation, and ran a limited empirical experiment with the sample, with each draft assessed and revised before continuing to the next step. In the second year, the prototype was tested empirically in four classes of the English department at STKIP YPUP. The test was implemented through action research, followed by evaluation and validation by the experts.

Keywords: learning model, reflection, strategy, reading, writing, development

Procedia PDF Downloads 345
3392 Simulating the Dynamics of E-waste Production from Mobile Phone: Model Development and Case Study of Rwanda

Authors: Rutebuka Evariste, Zhang Lixiao

Abstract:

Mobile phone sales and stocks have shown exponential growth in the past years globally, and the number of mobile phones produced each year surpassed one billion in 2007. This soaring growth of related e-waste deserves sufficient attention regionally and globally, given that about 40% of a phone's total weight is metallic, of which 12 elements are identified as highly hazardous and 12 as less harmful. Different studies and methods have been used to estimate the number of obsolete mobile phones, but none has developed a dynamic model or handled the discrepancies resulting from improper approaches and errors in the input data. The aim of this study was to develop a comprehensive dynamic system model for simulating the dynamics of e-waste production from mobile phones regardless of the country or region, overcoming these previous errors. The logistic model method combined with the STELLA program was used to carry out this study. A simulation for Rwanda was then conducted and compared with other countries' results for model testing and validation. Rwanda had about 1.5 million obsolete mobile phones, amounting to 125 tons of waste, in 2014, with e-waste production peaking in 2017. By 2020, the number is expected to reach 4.17 million obsolete phones and 351.97 tons of waste, with an environmental impact intensity 21 times that of 2005. Thus, it is concluded through model testing and validation that the present dynamic model is competent and able to deal with mobile phone e-waste production, as it has answered the questions raised by previous studies from the Czech Republic, Iran, and China.
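The logistic-model core of such a simulation can be sketched in a few lines: the in-use stock follows logistic growth toward a carrying capacity, and phones enter the waste stream roughly one service life after entering use. The carrying capacity, growth rate, and lifespan below are illustrative assumptions, not the paper's fitted values for Rwanda.

```python
import numpy as np

def logistic_stock(t, K, P0, r):
    """Logistic growth of the in-use mobile phone stock."""
    return K / (1 + (K / P0 - 1) * np.exp(-r * t))

# Illustrative parameters (not the paper's fitted values):
K, P0, r = 8.0e6, 0.1e6, 0.6   # carrying capacity, initial stock, growth rate
lifespan = 3                    # assumed average service life (years)
years = np.arange(0, 21)
stock = logistic_stock(years, K, P0, r)

# Phones entering the waste stream: units that entered use `lifespan`
# years earlier, approximated by the lagged annual inflow.
inflow = np.diff(stock, prepend=P0)
obsolete = np.roll(inflow, lifespan)
obsolete[:lifespan] = 0.0

peak_year = int(years[np.argmax(obsolete)])
print(peak_year)
```

The obsolete-phone flow peaks a few years after the stock's inflection point, the same mechanism behind the paper's finding that Rwanda's e-waste production peaks a few years after 2014.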

Keywords: carrying capacity, dematerialization, logistic model, mobile phone, obsolescence, similarity, Stella, system dynamics

Procedia PDF Downloads 322
3391 Role of Empirical Evidence in Law-Making: Case Study from India

Authors: Kaushiki Sanyal, Rajesh Chakrabarti

Abstract:

In India, on average, about 60 Bills are passed every year in both Houses of Parliament – Lok Sabha and Rajya Sabha (calculated from information on the websites of both Houses). These are debated in both the Lok Sabha (House of the People) and the Rajya Sabha (Council of States) before they are passed. However, lawmakers rarely use empirical evidence to make a case for a law. Most of the time, they support a law on the basis of anecdote, intuition, and common sense. While these do play a role in law-making, without the necessary empirical evidence, laws often fail to achieve their desired results. The quality of legislative debates is an indicator of the efficacy of the legislative process through which a Bill is enacted. However, the study of legislative debates has not received much attention either in India or worldwide due to the difficulty of objectively measuring the quality of a debate. Broadly, three approaches have emerged in the study of legislative debates. The rational-choice or formal approach shows that speeches vary based on different institutional arrangements, intra-party politics, and the political culture of a country. The discourse approach focuses on the underlying rules and conventions and how they impact the content of the debates. The deliberative approach posits that legislative speech can be reasoned, respectful, and informed. This paper aims to (a) develop a framework to judge the quality of debates using the deliberative approach; (b) examine the legislative debates of three Bills passed in different periods as a demonstration of the framework; and (c) examine the broader structural issues that disincentivize MPs from scrutinizing Bills. The framework would include qualitative and quantitative indicators to judge a debate. The idea is that the framework would provide useful insights into the legislators' knowledge of the subject, the depth of their scrutiny of Bills, and their inclination toward evidence-based research. 
The three Bills that the paper plans to examine are as follows: 1. The Narcotics Drugs and Psychotropic Substances Act, 1985: This act was passed to curb drug trafficking and abuse. However, it mostly failed to fulfill its purpose. Consequently, it was amended thrice, but without much impact on the ground. 2. The Criminal Laws (Amendment) Act, 2013: This act amended the Indian Penal Code to add a section on human trafficking. The purpose was to curb trafficking and penalise traffickers, pimps, and middlemen. However, the crime rate remains high while the conviction rate is low. 3. The Surrogacy (Regulation) Act, 2021: This act bans commercial surrogacy, allowing only relatives to act as surrogates as long as there is no monetary payment. Experts fear that instead of preventing commercial surrogacy, it would drive the activity underground, and the consequences would be borne by the surrogate, who would not be protected by law. The purpose of the paper is to objectively analyse the quality of parliamentary debates, gain insights into how MPs understand evidence, and deliberate on steps to incentivise them to use empirical evidence.

Keywords: legislature, debates, empirical, India

Procedia PDF Downloads 67
3390 Content-Based Mammograms Retrieval Based on Breast Density Criteria Using Bidimensional Empirical Mode Decomposition

Authors: Sourour Khouaja, Hejer Jlassi, Nadia Feddaoui, Kamel Hamrouni

Abstract:

Most medical images, and especially mammograms, are now stored in large databases, and retrieving a desired image is of great importance for finding previously diagnosed similar cases. Our method is implemented to assist radiologists in retrieving mammographic images containing breasts with a density aspect similar to that seen on the query mammogram. This is a challenge given the importance of density criteria in cancer prediction and their effect on segmentation issues. We used Bidimensional Empirical Mode Decomposition (BEMD) to characterize the content of the images and the Euclidean distance to measure similarity between images. Through experiments on the MIAS mammography image database, we confirm that the results are promising. The performance was evaluated using precision and recall curves comparing query and retrieved images. Computing recall and precision proved the effectiveness of applying CBIR to large mammographic image databases: we found a precision of 91.2% with a recall of 86.8%.
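The retrieval and evaluation loop described above reduces to nearest-neighbour search in feature space plus precision/recall over the retrieved set. The sketch below uses random vectors standing in for BEMD-derived features and synthetic density labels; the database size, feature dimension, and labels are assumptions, not MIAS data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical feature vectors (e.g., statistics of BEMD intrinsic mode
# functions) for a small database; label = breast density class.
db_feats = rng.normal(size=(50, 16))
db_labels = rng.integers(0, 3, 50)

def retrieve(query, k=10):
    """Rank database images by Euclidean distance to the query features."""
    d = np.linalg.norm(db_feats - query, axis=1)
    return np.argsort(d)[:k]

def precision_recall(retrieved, query_label):
    """Precision and recall of one retrieved set against the label."""
    relevant = np.where(db_labels == query_label)[0]
    hits = len(set(retrieved) & set(relevant))
    return hits / len(retrieved), hits / len(relevant)

q = db_feats[0] + rng.normal(scale=0.01, size=16)  # near-duplicate query
top = retrieve(q, k=10)
p, r = precision_recall(top, db_labels[0])
print(top[0])  # the near-duplicate itself should rank first
```

Sweeping k and averaging (p, r) over many queries yields the precision-recall curves the paper uses for evaluation.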

Keywords: BEMD, breast density, content-based, image retrieval, mammography

Procedia PDF Downloads 212
3389 Modelling and Optimisation of Floating Drum Biogas Reactor

Authors: L. Rakesh, T. Y. Heblekar

Abstract:

This study entails the development and optimization of a mathematical model for a floating drum biogas reactor from first principles, using thermal and empirical considerations. The model was derived on the basis of mass conservation, lumped-mass heat transfer formulations, and empirical biogas formation laws. The treatment leads to a system of coupled nonlinear ordinary differential equations whose solution maps four time-independent controllable parameters to five output variables that adequately describe the reactor performance. These equations were solved numerically using the fourth-order Runge-Kutta method for a range of input parameter values. Using the data so obtained, an artificial neural network with a single hidden layer was trained using the Levenberg-Marquardt damped least squares (DLS) algorithm. This network was then fine-tuned for optimal mapping by varying the hidden layer size. The resulting fast forward model was employed as a health-score generator in a bacterial foraging optimization code, and the optimal operating state of the simplified biogas reactor was thus obtained.
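The numerical core of such a model is a classical fourth-order Runge-Kutta integrator applied to the coupled ODE system. The two-equation system below (a lumped-mass cooling law driving an empirical gas-rate law) is a stand-in with assumed coefficients, not the paper's reactor equations.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Illustrative coupled system standing in for the reactor ODEs:
# slurry temperature T relaxing to ambient, gas volume V driven by T.
def f(t, y):
    T, V = y
    return np.array([-0.1 * (T - 30.0),              # lumped-mass cooling
                     0.02 * max(T - 25.0, 0.0)])     # empirical gas-rate law

t, h = 0.0, 0.1
y = np.array([55.0, 0.0])   # initial temperature (C), gas volume (m^3)
while t < 24.0:             # simulate one day
    y = rk4_step(f, t, y, h)
    t += h
print(round(y[0], 2), round(y[1], 3))
```

Sampling such trajectories over a grid of input parameters produces the dataset on which the surrogate neural network is trained.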

Keywords: biogas, floating drum reactor, neural network model, optimization

Procedia PDF Downloads 123
3388 Automation of Savitsky's Method for Power Calculation of High Speed Vessel and Generating Empirical Formula

Authors: M. Towhidur Rahman, Nasim Zaman Piyas, M. Sadiqul Baree, Shahnewaz Ahmed

Abstract:

The design of high-speed craft has recently become one of the most active areas of naval architecture. Increased speed makes these vehicles more efficient and useful for military, economic, or leisure purposes. The planing hull is designed specifically to achieve relatively high speed on the surface of the water, and speed on the water surface is closely related to the size of the vessel and the installed power. The Savitsky method, first presented in 1964 and since extended to non-monohedric and stepped hulls, is well known as a reliable alternative to CFD analysis of hull resistance. A computer program based on Savitsky's method has been developed using MATLAB, and the power of high-speed vessels has been computed in this research. First, the program reads principal parameters such as displacement, LCG, speed, deadrise angle, and the inclination of the thrust line with respect to the keel line, and calculates the resistance of the hull using Savitsky's empirical planing equations. However, some functions used in the empirical equations are available only in graphical form, which is not suitable for automatic computation. A digital plotting system was therefore used to extract data from the nomograms, and the regression equations of those functions were derived from the data of the different charts. As a result, the wetted length-beam ratio and trim angle can be determined directly from the input of the initial variables, which makes the power calculation automatic, without manual plotting of secondary variables such as p/b and other coefficients. Finally, the trim angle, mean wetted length-beam ratio, frictional coefficient, resistance, and power are computed and compared with Savitsky's results, and good agreement has been observed.
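The chart-to-regression step described above amounts to fitting a low-order polynomial to points digitized from a nomogram so that the lookup becomes a closed-form equation inside the automated calculation. The (x, y) points below are hypothetical digitized values, not taken from Savitsky's charts.

```python
import numpy as np

# Hypothetical points digitized from one nomogram curve
# (x: a speed coefficient, y: the chart function value).
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = np.array([0.40, 0.62, 0.91, 1.27, 1.70, 2.20, 2.77])

# Fit a low-order polynomial so the graphical lookup becomes a
# regression equation usable in the automated power calculation.
coeffs = np.polyfit(x, y, deg=2)
chart_fn = np.poly1d(coeffs)

# Interpolated lookup replaces manual reading of the nomogram
val = float(chart_fn(2.25))
resid = float(np.max(np.abs(chart_fn(x) - y)))
print(resid < 0.05)
```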

Keywords: nomogram, planing hull, principal parameters, regression

Procedia PDF Downloads 382
3387 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City

Authors: Christian Kapuku, Seung-Young Kho

Abstract:

An accurate representation of the transportation system serving a region is one of the important aspects of transportation modeling. Such a representation often requires developing an abstract model of the system elements, which in turn requires a substantial amount of data, surveys, and time. However, in some cases, such as in developing countries, data deficiencies and time and budget constraints do not always allow such an accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of open source Internet data, especially in mapping technologies, as well as advances in Geographic Information Systems (GIS), opportunities to tackle these issues have arisen. The objective of this paper is therefore to demonstrate such an application through the practical case of developing the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct the digitized map of Transportation Analysis Zones from available scanned images. Centroids were then dynamically placed at the centers of activities using an activity density map. Next, the road network with its characteristics was built from OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked by comparing it with satellite images from Google and Bing. For validation, the final network was exported into Emme3 to check for potential network coding issues. Results show a high accuracy between the built network and the satellite images, which can mostly be attributed to the use of open source data.
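The link-cleaning step can be sketched as filtering an edge list by its OpenStreetMap highway tag and then sanity-checking that the kept network stays connected. The edge list, node IDs, and the exact set of kept tags are illustrative assumptions, not the Kinshasa dataset.

```python
# Hypothetical edge list extracted from OpenStreetMap layers:
# (from_node, to_node, highway_tag, length_m)
edges = [
    (1, 2, "primary", 850.0),
    (2, 3, "secondary", 620.0),
    (3, 4, "residential", 150.0),   # dropped from the model network
    (2, 4, "tertiary", 940.0),
    (4, 5, "primary", 1200.0),
    (5, 1, "residential", 90.0),    # dropped
]

KEEP = {"motorway", "trunk", "primary", "secondary", "tertiary"}
network = [e for e in edges if e[2] in KEEP]

def connected(edge_list):
    """Flood-fill check that all nodes in the kept edges are reachable."""
    if not edge_list:
        return False
    nodes = {edge_list[0][0]}
    changed = True
    while changed:
        changed = False
        for a, b, *_ in edge_list:
            if (a in nodes) != (b in nodes):
                nodes |= {a, b}
                changed = True
    all_nodes = {n for a, b, *_ in edge_list for n in (a, b)}
    return nodes == all_nodes

print(len(network), connected(network))
```

A disconnected result after cleaning would flag exactly the kind of network coding issue the paper catches during the Emme3 validation step.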

Keywords: geographic information system (GIS), network construction, transportation database, open source data

Procedia PDF Downloads 145
3386 The Determinants of Enterprise Risk Management: Literature Review, and Future Research

Authors: Sylvester S. Horvey, Jones Mensah

Abstract:

The growing complexities and dynamics of the business environment have led to a new approach to risk management, known as enterprise risk management (ERM). ERM is a system and an approach for managing the risks of an organization in an integrated manner in order to achieve corporate goals and strategic objectives. Regardless of the diversity of business environments, ERM has become an essential factor in managing individual and business risks, because ERM is believed to enhance shareholder value and firm growth. Despite the growing literature on ERM, research on the question of what factors drive ERM remains limited. This study provides a comprehensive literature review of the main factors that contribute to ERM implementation. Google Scholar was the leading search engine used to identify empirical literature, and the review spanned the period between 2000 and 2020. Articles published in journals in the Scimago journal rankings and Scopus were examined. Thirteen firm characteristics and sixteen articles were considered in the empirical review. Most empirical studies agreed that firm size, institutional ownership, industry type, auditor type, industrial diversification, earnings volatility, stock price volatility, and the presence of an internal auditor have a positive relationship with ERM adoption, with firm size, institutional ownership, auditor type, and industry type mostly found to be statistically significant. Other factors, such as financial leverage, profitability, asset opacity, international diversification, and firm complexity, showed inconclusive results. The growing literature on ERM is not without limitations; hence, this study suggests that further research should examine ERM determinants within new geographical contexts while considering new and more robust ways of measuring ERM, rather than relying on a simple proxy (dummy) variable. Other firm characteristics, such as organizational culture and context, corporate scandals and losses, and governance, could also be considered as determinants of ERM adoption.

Keywords: enterprise risk management, determinants, ERM adoption, literature review

Procedia PDF Downloads 146