Search results for: equivalent linear model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19015

11695 Detecting Model Financial Statement Fraud by Auditor Industry Specialization with Fraud Triangle Analysis

Authors: Reskino Resky

Abstract:

This research aims to develop a model for detecting financial statement fraud. It examines the fraud triangle variables and auditor industry specialization in relation to financial statement fraud. The sample consists of companies listed on the Indonesia Stock Exchange that were sanctioned or involved in cases handled by the Financial Services Authority in 2011-2013; 30 fraud companies and 30 non-fraud companies were included. The sample was selected by purposive sampling with judgement sampling, and the data were analysed using the Mann-Whitney U test and discriminant analysis. Two of the five variables could be processed with discriminant analysis. The results show that financial targets can detect financial statement fraud, while financial stability cannot.
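
As an illustration of the two-step procedure described above, the sketch below screens hypothetical fraud-triangle proxies with a Mann-Whitney U test and then fits a discriminant model in Python; the data, variable names, and group labels are placeholders, not the study's dataset.

```python
# Minimal sketch (hypothetical data and column names): univariate screening
# with a Mann-Whitney U test followed by linear discriminant analysis.
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 30  # 30 fraud and 30 non-fraud firms, as in the abstract
X = np.vstack([rng.normal(0.0, 1.0, (n, 2)),    # non-fraud group: two ratios
               rng.normal(0.8, 1.0, (n, 2))])   # fraud group (shifted mean)
y = np.array([0] * n + [1] * n)                 # 0 = non-fraud, 1 = fraud

# Screening: compare the distribution of each variable between the two groups
for j, name in enumerate(["financial_targets", "financial_stability"]):
    stat, p = mannwhitneyu(X[y == 0, j], X[y == 1, j])
    print(f"{name}: U = {stat:.1f}, p = {p:.3f}")

# Discriminant analysis on the retained variables
lda = LinearDiscriminantAnalysis().fit(X, y)
print("classification accuracy:", lda.score(X, y))
```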

Keywords: fraud triangle analysis, financial targets, financial stability, auditor industry specialization, financial statement fraud

Procedia PDF Downloads 445
11694 Modelling of Polymeric Fluid Flows between Two Coaxial Cylinders Taking into Account the Heat Dissipation

Authors: Alexander Blokhin, Ekaterina Kruglova, Boris Semisalov

Abstract:

A mathematical model based on the mesoscopic theory of polymer dynamics is developed for numerical simulation of the flows of polymeric liquid between two coaxial cylinders. The model is a system of nonlinear partial differential equations written in the cylindrical coordinate system and coupled with the heat conduction equation, which includes a specific dissipation term. Stationary flows similar to the classical Poiseuille flow are considered, and the resolving equations for the flow velocity and the temperature are obtained. To solve them, a fast pseudospectral method based on Chebyshev approximations is designed, which enables one to simulate flows through channels with extremely small relative values of the inner cylinder radius. The dependence of the flow on this radius and on the value of the dissipation constant is analysed numerically.
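
The following minimal sketch illustrates the Chebyshev pseudospectral (collocation) idea on a simple model boundary-value problem u''(x) = f(x) with homogeneous Dirichlet conditions; it is not the authors' polymer-flow solver, and the differentiation matrix follows the standard Trefethen construction.

```python
# Chebyshev collocation sketch for u''(x) = exp(x), u(-1) = u(1) = 0.
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix and Gauss-Lobatto points (Trefethen)."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

N = 32
D, x = cheb(N)
D2 = D @ D                       # second-derivative operator
f = np.exp(x)                    # sample right-hand side
A = D2[1:-1, 1:-1]               # impose u(-1) = u(1) = 0
u = np.zeros(N + 1)
u[1:-1] = np.linalg.solve(A, f[1:-1])

exact = np.exp(x) - np.sinh(1.0) * x - np.cosh(1.0)
print("max error:", np.max(np.abs(u - exact)))   # spectral accuracy expected
```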

Keywords: dynamics of polymeric liquid, heat dissipation, singularly perturbed problem, pseudospectral method, Chebyshev polynomials, stabilization technique

Procedia PDF Downloads 278
11693 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model

Authors: Can Huang, Xiaoliang Wang, Qingquan Liu

Abstract:

Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, accompanied by large deformation of soil, strong interface coupling, and three-dimensional effects. A meshless particle method, smooth particle hydrodynamics (SPH), has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open source code DualSPHysics (v4.0). To overcome the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration is implemented in this code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of the model. Then, the Huangtian LGIW, a real large-scale LGIW problem, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle distance of 5.0 m provides a converged landslide deposit and surge wave for this example. Numerical simulation results are in good agreement with the limited field survey data. The application to the Huangtian LGIW provides a typical reference for large-scale LGIW assessments, supplying reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.

Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH

Procedia PDF Downloads 47
11692 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw data and simulated data) and two models (stationary and non-stationary) of the GEV distribution, a return level analysis is carried out. It is found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics are likely to increase in the future, there is a maximum temperature that will not be exceeded. The results of this paper are vital for agricultural and environmental research.
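
A hedged sketch of the block-maxima workflow is given below: a GEV distribution is fitted to synthetic annual maxima with SciPy and return levels are read off from its quantile function; the data are placeholders for the CDC records, and SciPy's shape-parameter convention is noted in the comments.

```python
# Block-maxima GEV fit and return levels on synthetic annual maximum temperatures.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
annual_max = 30 + rng.gumbel(loc=2.0, scale=1.5, size=40)   # placeholder block maxima

shape, loc, scale = genextreme.fit(annual_max)
for T in (10, 50, 100):                                      # return periods in years
    level = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.2f} °C")

# In SciPy's convention a positive shape parameter c gives a bounded upper tail
# (finite upper end point), while c < 0 corresponds to the heavy-tailed Fréchet case.
print("fitted shape parameter c:", shape)
```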

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 457
11691 Numerical Investigation of Heat Transfer in Laser Irradiated Biological Sample Based on Dual-Phase-Lag Heat Conduction Model Using Lattice Boltzmann Method

Authors: Shashank Patidar, Sumit Kumar, Atul Srivastava, Suneet Singh

Abstract:

The present work is concerned with the numerical investigation of the thermal response of biological tissues during laser-based photo-thermal therapy for destroying cancerous/abnormal cells with minimal damage to the surrounding normal cells. Light propagation through the biological sample is mathematically modelled by the transient radiative transfer equation (RTE). In the present work, the application of the lattice Boltzmann method is extended to analyse the transport of short-pulse radiation in a participating medium. To determine the two-dimensional temperature distribution inside the tissue medium, the RTE has been coupled with Pennes' bio-heat transfer equation based on Fourier's law by several researchers in the last few years.
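
For illustration only, the sketch below solves the classical Fourier-based Pennes bio-heat equation in one dimension with a Beer-Lambert laser source using an explicit finite-difference scheme; the dual-phase-lag formulation, the RTE coupling, and the lattice Boltzmann discretisation of the paper are not reproduced, and all tissue and laser parameters are assumed values.

```python
# 1-D explicit finite-difference Pennes bio-heat equation with a laser source term.
import numpy as np

L, nx = 0.01, 101                       # 1 cm tissue slab, grid points
dx = L / (nx - 1)
k, rho, cp = 0.5, 1000.0, 4000.0        # conductivity, density, heat capacity (assumed)
wb, rho_b, cb = 0.0005, 1000.0, 4000.0  # blood perfusion rate and blood properties
Ta = 37.0                               # arterial temperature (°C)
T = np.full(nx, 37.0)                   # initial tissue temperature
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / alpha                # stable explicit time step

mu_a, I0 = 100.0, 5e4                   # absorption coefficient (1/m), irradiance (W/m^2)
x = np.linspace(0.0, L, nx)
S = mu_a * I0 * np.exp(-mu_a * x)       # Beer-Lambert volumetric laser heating (W/m^3)

for _ in range(2000):
    Tn = T.copy()
    T[1:-1] = Tn[1:-1] + dt * (
        alpha * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) / dx**2
        + (wb * rho_b * cb * (Ta - Tn[1:-1]) + S[1:-1]) / (rho * cp)
    )
    T[0], T[-1] = T[1], Ta              # insulated surface, deep tissue at arterial temp.

print("peak tissue temperature (°C):", T.max())
```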

Keywords: lattice Boltzmann method, transient radiation transfer equation, dual phase lag model

Procedia PDF Downloads 333
11690 Subjective versus Objective Assessment for Magnetic Resonance (MR) Images

Authors: Heshalini Rajagopal, Li Sze Chow, Raveendran Paramesran

Abstract:

Magnetic Resonance Imaging (MRI) is one of the most important medical imaging modalities. Subjective assessment of image quality is regarded as the gold standard for evaluating MR images. In this study, a database of 210 MR images, containing ten reference images and 200 distorted images, is presented. The reference images were distorted with four types of distortion: Rician noise, Gaussian white noise, Gaussian blur, and DCT compression. The 210 images were assessed by ten subjects, and the subjective scores are presented as Difference Mean Opinion Scores (DMOS). The DMOS values were compared with four FR-IQA metrics. The Pearson Linear Correlation Coefficient (PLCC) and the Spearman Rank Order Correlation Coefficient (SROCC) were used to validate the DMOS values. The high PLCC and SROCC values show that the DMOS values are close to the objective FR-IQA metrics.
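
The validation step described above reduces to two correlation computations; the sketch below shows how PLCC and SROCC between DMOS values and an objective metric (PSNR is used here only as a stand-in) can be computed with SciPy on made-up numbers.

```python
# PLCC and SROCC between subjective DMOS and an objective quality metric.
import numpy as np
from scipy.stats import pearsonr, spearmanr

dmos = np.array([35.2, 48.1, 52.7, 61.3, 70.9, 78.4])   # hypothetical DMOS values
psnr = np.array([38.5, 33.2, 31.8, 28.4, 25.1, 22.0])   # hypothetical metric scores

plcc, _ = pearsonr(dmos, psnr)
srocc, _ = spearmanr(dmos, psnr)
# Higher DMOS means worse quality, so a strong negative correlation is expected here.
print(f"PLCC = {plcc:.3f}, SROCC = {srocc:.3f}")
```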

Keywords: magnetic resonance (MR) images, difference mean opinion score (DMOS), full reference image quality assessment (FR-IQA)

Procedia PDF Downloads 446
11689 Survival Analysis of Identifying the Risk Factors of Affecting the First Recurrence Time of Breast Cancer: The Case of Tigray, Ethiopia

Authors: Segen Asayehegn

Abstract:

Introduction: In Tigray, Ethiopia, breast cancer is, after cervical cancer, one of the most common cancer health problems for women. Objectives: This article aims to identify the prospective and potential risk factors affecting the time to first recurrence of breast cancer patients in Tigray, Ethiopia. Methods: The data were taken from patients' medical records registered from January 2010 to January 2020. The study considered a sample of 1842 breast cancer patients. Powerful non-parametric and parametric shared frailty survival regression models (FSRM) were applied, and model comparisons were performed. Results: Out of 1842 breast cancer patients, about 1290 (70.02%) recovered from the disease. The median time to recovery from breast cancer was found to be 12.8 months. The model comparison favoured the lognormal parametric shared frailty survival regression model, which indicated that treatment, stage of breast cancer, smoking habit, and marital status significantly affect the first recurrence of breast cancer. Conclusion: Factors such as treatment, stage of cancer, and marital status improved the time to recovery, while smoking habit worsened it. Recommendation: To reduce breast cancer health problems, the authors recommend that regional health sector facilities be improved. More importantly, concerned bodies and medical doctors should emphasize the identified factors during treatment. Furthermore, general awareness programs on the identified factors should be given to the community.
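
As a rough illustration of estimating covariate effects on time to first recurrence, the sketch below fits a lognormal accelerated failure time model with the lifelines package to simulated data; lifelines does not offer shared-frailty models out of the box, so the frailty term of the study is omitted, and all variables and values are hypothetical.

```python
# Lognormal AFT fit on hypothetical recurrence data (no shared frailty term).
import numpy as np
import pandas as pd
from lifelines import LogNormalAFTFitter

rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "time": rng.lognormal(mean=2.5, sigma=0.6, size=n),   # months to recurrence/censoring
    "event": rng.integers(0, 2, size=n),                  # 1 = recurrence observed
    "treatment": rng.integers(0, 2, size=n),
    "smoker": rng.integers(0, 2, size=n),
    "stage": rng.integers(1, 5, size=n),
})

aft = LogNormalAFTFitter()
aft.fit(df, duration_col="time", event_col="event")
aft.print_summary()   # exp(coef) acts as an acceleration factor for each covariate
```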

Keywords: acceleration factor, breast cancer, Ethiopia, shared frailty survival models, Tigray

Procedia PDF Downloads 121
11688 Red Meat Price Volatility and Its Relationship with Crude Oil and Exchange Rate

Authors: Melek Akay

Abstract:

Turkey's agricultural commodity prices are prone to fluctuation over time. A considerable amount of literature examines the changes in these prices in relation to other commodities such as energy, and links between agricultural and energy markets have therefore been extensively investigated. Since red meat prices are becoming increasingly volatile in Turkey, this paper analyses the price volatility of veal and lamb and the relationship between red meat, crude oil, and exchange rates by applying the generalized all-period unconstrained volatility model, which generalises the GARCH(p, q) model, to weekly data covering the period from May 2006 to February 2017. Empirical results show that veal and lamb prices exhibited volatility during the last decade, particularly between 2009 and 2012. Moreover, oil prices, as well as their previous periods, have a significant effect on veal and lamb prices. Consequently, this research can help policy makers evaluate policy implementation appropriately and reduce the impact of oil prices by supporting producers.
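
A hedged sketch of the volatility-modelling step is shown below: a standard GARCH(1,1) model from the arch package is fitted to synthetic weekly returns standing in for the veal price series; the paper's generalized all-period unconstrained specification and the oil-price and exchange-rate regressors are not reproduced here.

```python
# GARCH(1,1) fit on synthetic weekly returns with the 'arch' package.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(3)
# Placeholder weekly returns, roughly the length of May 2006 - Feb 2017
returns = pd.Series(rng.normal(0, 1.2, 560))

model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = model.fit(disp="off")
print(res.summary())
# Volatility persistence: values close to 1 indicate long-lasting volatility clusters
print("persistence (alpha + beta):", res.params["alpha[1]"] + res.params["beta[1]"])
```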

Keywords: red meat price, volatility, crude oil, exchange rates, GARCH models, Turkey

Procedia PDF Downloads 113
11687 Dynamic Environmental Impact Study during the Construction of the French Nuclear Power Plants

Authors: A. Er-Raki, D. Hartmann, J. P. Belaud, S. Negny

Abstract:

This paper has a double purpose: firstly, a literature review of life cycle analysis (LCA), and secondly, a comparison between conventional (static) LCA and multi-level dynamic LCA on the following items: (i) the evolution of inventories with time and (ii) the temporal evolution of the databases. The first part of the paper summarizes the state of the art of the static LCA approach. The different limits of static LCA have been identified, in particular the non-consideration of spatial and temporal evolution in the inventory, in the characterization factors (CFs), and in the databases. A description of the different levels of integration of the notion of temporality in life cycle analysis studies is then given. In the second part, the dynamic inventory has been evaluated, firstly for a single nuclear plant and secondly for the entire French nuclear power fleet, taking into account the construction durations of all the plants. In addition, the databases have been adapted by integrating the temporal variability of the French energy mix. Several iterations were used to converge towards the real environmental impact of the energy mix. The databases were further adapted to take into account the temporal evolution of the market data for the raw materials. The energy mix of the period studied was identified based on an extrapolation of the production reference values of each means of production. An application to the construction of the French nuclear power plants from 1971 to 2000 has been performed, in which a dynamic inventory of raw materials has been evaluated. The impacts were then characterized with the ILCD 2011 characterization method. In order to compare with a purely static approach, a static impact assessment was made with the Ecoinvent V3.4 data sheets without adaptation and with a static inventory assuming that all the power stations had been built at the same time. Finally, a comparison between the static and dynamic LCA approaches was set up to determine the gap between them for each of the two levels of integration. The results were analysed to identify the contribution of the evolving construction of the nuclear power fleet to the total environmental impacts of the French energy mix during the same period. An equivalent strategy using a dynamic approach will further be applied to identify the environmental impacts of different energy transition scenarios, allowing the best energy mix to be chosen from an environmental viewpoint.
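
The core static-versus-dynamic comparison can be illustrated with a toy calculation: the sketch below evaluates the same yearly energy inventory either with one fixed electricity-mix emission factor or with year-by-year factors; all figures are hypothetical and do not correspond to the paper's ILCD 2011 or Ecoinvent results.

```python
# Toy static vs. dynamic inventory evaluation with a time-varying energy-mix factor.
import numpy as np

years = np.arange(1971, 2001)
energy_use = np.full(years.size, 1.0e6)            # MWh consumed per construction year (assumed)
mix_factor = np.linspace(0.55, 0.08, years.size)   # tCO2e/MWh, hypothetical evolving mix

static_impact = energy_use.sum() * mix_factor[0]         # one fixed factor for all years
dynamic_impact = float(np.sum(energy_use * mix_factor))  # inventory follows the mix over time

print(f"static : {static_impact:,.0f} tCO2e")
print(f"dynamic: {dynamic_impact:,.0f} tCO2e")
print(f"gap    : {100 * (static_impact - dynamic_impact) / static_impact:.1f} %")
```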

Keywords: LCA, static, dynamic, inventory, construction, nuclear energy, energy mix, energy transition

Procedia PDF Downloads 90
11686 Social Work Practice to Labour Welfare: A Proposed Model of Field Work Practicum and Role of Social Worker in India

Authors: Naeem Ahmed

Abstract:

Social work is a professional activity based on the approach of "helping people to help themselves" (Stroup). Social work education and practice are both based on a humanitarian philosophy in which social workers try to increase the happiness of society and to reduce its problems. Labour welfare is a specialised field of social work which focuses particularly on the welfare of organised and unorganised labour. In India, labour faces numerous problems in both the organised and unorganised sectors because of ignorance, illiteracy, a high rate of unemployment, etc. Most Indian social work institutions offer this specialization under different names, such as Human Resource Management, Industrial Relations and Personnel Management, Industrial Relations and Labour Welfare, or Industrial Social Work. Field work practice is an integrated part of the social work education curriculum in all specialised fields. In India, different field work practice models are followed in different institutions. The main objective of this paper is to prepare a universal field work practicum model in the field of labour welfare. The paper is exploratory in nature; the researcher used personal experience and secondary data (models of field work practice in different institutions such as Aligarh Muslim University, Pondicherry University, Central University of Karnataka, University of Lucknow, MJP Rohilkhand University Bareilly, etc.). The researcher found that there is an immediate need to upgrade the curriculum and field work practice in this particular field, as more than 40 percent of the total population is engaged in either the unorganised or the organised sector (NSSO 2011-12) and is not aware of its rights. In this way, a social worker can play an important role in existing labour welfare facilities by creating such awareness.

Keywords: field work, labour welfare, organised labour, social work practice, unorganised labour

Procedia PDF Downloads 385
11685 Using Data Mining Techniques to Evaluate the Different Factors Affecting the Academic Performance of Students at the Faculty of Information Technology in Hashemite University in Jordan

Authors: Feras Hanandeh, Majdi Shannag

Abstract:

This research studies the different factors that could affect the cumulative average of students at the Faculty of Information Technology, Hashemite University. The paper examines student information, background, and academic records, and how this information affects students' ability to obtain high grades. The student information used in the study is extracted from the students' academic records. Data mining tools and techniques are used to determine which attribute(s) affect the students' cumulative average. The results show that the most important factor affecting the students' cumulative average is the student acceptance type. A decision tree model and rules were built to determine how students can obtain high grades in their courses. The overall accuracy of the model is 44%, which is an acceptable rate.
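
The classification step can be illustrated as follows: a small decision tree is trained on a hypothetical student-record table and its rules are printed; the feature names, encodings, and target are assumptions and do not reproduce the Hashemite University dataset or the reported 44% accuracy.

```python
# Decision tree on a toy student-record table, with readable rules extracted.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.DataFrame({
    "acceptance_type": [0, 1, 0, 2, 1, 2, 0, 1],   # encoded admission category
    "high_school_avg": [88, 75, 92, 70, 80, 65, 95, 78],
    "credit_hours":    [15, 12, 18, 12, 15, 9, 18, 12],
    "high_gpa":        [1, 0, 1, 0, 1, 0, 1, 0],   # target: high cumulative average (yes/no)
})

X, y = df.drop(columns="high_gpa"), df["high_gpa"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))   # human-readable rules
print("training accuracy:", tree.score(X, y))
```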

Keywords: data mining, classification, extracting rules, decision tree

Procedia PDF Downloads 401
11684 An Assessment of Experiential Learning Outcomes of Study Abroad Programs in Hospitality: A Learning Style Perspective

Authors: Radesh Palakurthi

Abstract:

The purpose of this study was to determine the impact of experiential learning on learning outcomes in hospitality education. This paper presents the results of an online survey of students from the U.S. studying abroad and their self-reported change in learning outcomes, assessed using the Core Competencies Model for the Hospitality Industry developed by the Employment and Training Development Office of the U.S. Department of Labor. The impact of students' learning styles on learning outcomes is also evaluated in this study. Kolb's Learning Style Inventory was used to assess the students' learning styles. The results show that students reported significant improvements in their learning outcomes as a result of engaging in study abroad experiential learning programs. The students' learning styles had a significant effect on one of the core learning outcomes: personal effectiveness.

Keywords: hospitality competencies, hospitality education, Kolb’s learning style inventory, learning outcomes, study abroad

Procedia PDF Downloads 206
11683 Seismic Response Analysis of Frame Structures Based on Super Joint Element Model

Authors: Li Xu, Yang Hong, T. Zhao Wen

Abstract:

Experimental results of many RC beam-column subassemblages indicate that slippage of longitudinal beam rebar within the joint and shear deformation of the joint core have a significant influence on the seismic behavior of the subassemblage. However, the rigid joint assumption has generally been used in the seismic response analysis of RC frames, in which these two kinds of inelastic joint deformation are ignored. Based on the OpenSees platform, a 'Super Joint Element Model' with a more detailed inelastic mechanism is used to simulate the inelastic response of joints. Two finite element models of a typical RC plane frame, respectively considering or ignoring the inelastic deformation of the joints, were established and analyzed under seven strong earthquake waves. The simulated global and local inelastic deformations of the RC plane frame are shown and discussed. The analyses also confirm the safety of the earthquake-resistant frame designed according to Chinese codes.

Keywords: frame structure, beam-column joint, longitudinal bar slippage, shear deformation, nonlinear analysis

Procedia PDF Downloads 394
11682 A Study of Islamic Stock Indices and Macroeconomic Variables

Authors: Mohammad Irfan

Abstract:

The purpose of this paper is to investigate the relationship between key macroeconomic variables and the Islamic stock market in India. The study is based on time series data for the financial years 2009-2015 and explores the consistency of the relationship between macroeconomic variables and Shariah indices. The ADF (Augmented Dickey-Fuller) and PP (Phillips-Perron) tests are employed to check the stationarity of the data. The study depicts the long-run relationship between Shariah indices and macroeconomic variables using the Johansen co-integration test. BSE Shariah and Nifty Shariah show unidirectional Granger causality. The outcome of the VECM significantly confirms the applicability of the best-fitted model. Thus, Islamic stock indices are working proficiently for the development of the Indian economy, and the results suggest keeping an eye on the Islamic stock market, which will interact more with other macroeconomic variables in the future.
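
The econometric pipeline mentioned above (stationarity tests, Johansen co-integration, Granger causality, VECM) can be sketched with statsmodels as below; the two simulated series merely stand in for the Shariah index and macroeconomic data, and the Phillips-Perron test, which statsmodels does not provide, is omitted.

```python
# ADF test, Johansen co-integration, Granger causality, and VECM on simulated series.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(4)
n = 300
common = np.cumsum(rng.normal(size=n))            # shared stochastic trend
df = pd.DataFrame({
    "shariah_index": common + rng.normal(scale=0.5, size=n),   # placeholder series
    "macro_var": 0.8 * common + rng.normal(scale=0.5, size=n),
})

# Stationarity check of the level series (ADF)
print("ADF p-value (index level):", adfuller(df["shariah_index"])[1])

# Johansen co-integration test: trace statistics for ranks 0 and 1
print("Johansen trace statistics:", coint_johansen(df, det_order=0, k_ar_diff=1).lr1)

# Granger causality between the two series (prints test results per lag)
grangercausalitytests(df[["shariah_index", "macro_var"]], maxlag=2)

# Vector error correction model with one co-integrating relation
vecm_res = VECM(df, k_ar_diff=1, coint_rank=1).fit()
print(vecm_res.summary())
```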

Keywords: Indian Shariah Indices, macroeconomic variables, co-integration, Granger causality, vector error correction model (VECM)

Procedia PDF Downloads 268
11681 Numerical Solution of a Mathematical Model of Vortex Using Projection Method: Applications to Tornado Dynamics

Authors: Jagdish Prasad Maurya, Sanjay Kumar Pandey

Abstract:

Inadequate understanding of the complex nature of the flow features in a tornado vortex is a major problem in modelling tornadoes. Tornadoes are violent atmospheric phenomena that appear all over the world. Modelling tornadoes aims to reduce the loss of human life and the material damage they cause. The dynamics of a tornado are investigated by a numerical technique, an improved version of the projection method. In this paper, the authors solve the problem for an axisymmetric tornado vortex by the said method, which uses a finite difference approach to obtain an accurate and stable solution. The conclusions drawn are that a large radial inflow velocity occurs near the ground, which increases the tangential velocity. The increased velocity phenomenon occurs close to the boundary, and the absolute maximum wind is obtained near the vortex core. The results validate previous numerical and theoretical models.

Keywords: computational fluid dynamics, mathematical model, Navier-Stokes equations, tornado

Procedia PDF Downloads 338
11680 Smart Kids Coacher: Model for Childhood Obesity in Thailand

Authors: Pornwipa Daoduong, Jairak Loysongkroa, Napaphan Viriyautsahakul, Wachira Pengjuntr

Abstract:

Obesity is one of the serious health problems in many countries, including Thailand, where the prevalence of childhood obesity increased from 8.8% in 2014 to 9.5% in 2015 and 12.9% in 2016. The Ministry of Public Health's objective is to reduce the prevalence of childhood obesity to 10% or lower in 2017 by implementing measures relating to nutrition, physical activity (PA), and the environment in 6,405 targeted schools where the proportion of school children with obesity is higher than 10%. "Smart Kids Coacher" (SKC) is a new, innovative intervention created by the Department of Health and consists of 252 regional and provincial officers. The SKC aims to train super trainers in food and nutrition, PA, and emotional control through three learning activities: 1) Food for Fun, which covers the nutrition flag, nutrition labels, food portions, and nutrition surveillance; 2) Fun for Fit, which includes intermediate- and advanced-level workouts within 60 minutes, such as the kangaroo dance and chair stretching; and 3) Emotional Control, which aims to reduce the likelihood of access to unhealthy food, to ensure that meals are taken at appropriate times, and to recruit peers and family members to increase awareness among target groups. Apart from providing SKC lessons for 3,828 officers at the district level, 2,176 students were selected as role models through the "Smart Kids Leader" (SKL) activity. Consequently, the SKC lowered the proportion of childhood obesity from 17% in 2012 to 12.9% in 2016. The SKC coverage should be expanded to other settings. Policy makers should be aware of the importance of reducing the prevalence of childhood obesity and its related risks. Networking and collaboration between stakeholders are essential, as is an improvement of the holistic intervention and the "NuPETHS" knowledge for kids in the future.

Keywords: childhood obesity, model, obesity, smart kids coacher

Procedia PDF Downloads 217
11679 Key Performance Indicators and the Model for Achieving Digital Inclusion for Smart Cities

Authors: Khalid Obaed Mahmod, Mesut Cevik

Abstract:

The term smart city has appeared recently and has been accompanied by many definitions and concepts, but as a simplified and clear definition, it can be said that a smart city is a geographical location that has gained efficiency and flexibility in providing public services to citizens through its use of information and communication technologies, and this is what distinguishes it from other cities. Smart cities connect the various components of the city through main and sub-networks in addition to a set of applications, and are thus able to collect data that is the basis for providing technological solutions to manage resources and provide services. The work of the smart city is based on the use of artificial intelligence and Internet of Things technology. This work presents the concept of smart cities; the pillars, standards, and evaluation indicators on which smart cities depend; and the reasons that prompted the world to move towards their establishment. It also provides a simplified hypothetical way to measure the ideal smart city model by defining some indicators and key pillars, simulating them with logic circuits, and testing them to determine whether the city can be considered an ideal smart city or not.

Keywords: factors, indicators, logic gates, pillars, smart city

Procedia PDF Downloads 128
11678 Proactive Competence Management for Employees: A Bottom-up Process Model for Developing Target Competence Profiles Based on the Employee's Tasks

Authors: Maximilian Cedzich, Ingo Dietz Von Bayer, Roland Jochem

Abstract:

For industrial companies to continue to succeed in dynamic, globalized markets, they must be able to train their employees in an agile manner and at short notice in line with the exogenous conditions that arise. For this purpose, it is indispensable to operate a proactive competence management system for employees that recognizes qualification needs in a timely manner so that they can be addressed promptly through qualification measures. However, hardly any approaches that include systematic, proactive competence management can be found in the literature. To help close this gap, this publication presents a process model that systematically develops future-oriented target competence profiles bottom-up, based on the tasks of the employees. Concretely, in the first step, the tasks of the individual employees are examined under assumed future conditions; in other words, qualitative scenarios are considered for the individual tasks to determine how they are likely to change. In a second step, these scenario-based future tasks are translated into individual future-related target competencies of the employee using a matrix of generic task properties. The final step validates the target competence profiles formed in this way within the framework of a management workshop. This process model provides industrial companies with a tool that they can use to determine the competencies their employees will require in the future and compare them with the actually prevailing competencies. If gaps are identified between target and actual, these qualification requirements can be closed in the short term by means of qualification measures.

Keywords: dynamic globalized markets, employee competence management, industrial companies, knowledge management

Procedia PDF Downloads 180
11677 Bi-Objective Optimization for Sustainable Supply Chain Network Design in Omnichannel

Authors: Veerpaul Maan, Gaurav Mishra

Abstract:

The evolution of omnichannel has revolutionized the supply chains of organizations by enhancing the customer shopping experience. These organizations need to develop well-integrated multiple distribution channels to leverage the benefits of omnichannel. Adopting an omnichannel system in the supply chain has resulted in structuring and reconfiguring the practices of the traditional supply chain distribution network. In this paper, a multiple distribution supply chain network (MDSCN) is proposed that integrates online giants with a local retailers' distribution network in an uncertain environment, with sustainability taken into account. To incorporate sustainability, an additional objective function is added to reduce the carbon content by minimizing the travel distance of the product. Through this proposed model, customers are free to access products and services through their channels of choice, which increases their convenience, reach, and satisfaction. Further, a numerical illustration is shown, along with an interpretation of the results, to validate the proposed model.

Keywords: sustainable supply chain network, omnichannel, multiple distribution supply chain network, integrate multiple distribution channels

Procedia PDF Downloads 208
11676 An Elaboration Likelihood Model to Evaluate Consumer Behavior on Facebook Marketplace: Trust on Seller as a Moderator

Authors: Sharmistha Chowdhury, Shuva Chowdhury

Abstract:

Buying and selling new as well as second-hand goods such as tools, furniture, household items, electronics, clothing, baby items, vehicles, and hobby items through the Facebook Marketplace has become a new paradigm for c2c sellers. This phenomenon encourages and empowers decentralised, home-oriented sellers. This study adopts the Elaboration Likelihood Model (ELM) to explain consumer behaviour on the Facebook Marketplace (FM). The ELM suggests that consumers process information through central and peripheral routes, which eventually shape their attitudes towards posts. The central route focuses on information quality, and the peripheral route focuses on cues. Sellers' FM posts usually include product features, prices, conditions, pictures, and pick-up location. This study uses information relevance and accuracy as central route factors. A post's attractiveness represents a cue and creates positive or negative associations with the product; a post with remarkable pictures increases the attractiveness of the post, so post aesthetics is used as a peripheral route factor. People influenced via the central or peripheral route form an attitude that involves multiple processes: response and purchase intention. People respond to FM posts by saving, sharing, and chatting. A positive attitude is reflected in a positive image of the product and higher purchase intention. This study proposes trust in the seller as a moderator to test the strength of its influence on consumer attitudes and behaviour; trust in the seller is assessed by whether the seller has badges or not. A sample questionnaire will be developed and distributed among a group of random FM sellers who are selling vehicles on this platform to conduct the study. The product chosen for this study is the vehicle, a high-value purchase item. A high-value purchase requires consumers to form their attitude seriously, without any sign of impulsiveness; hence, vehicles are an apt choice for testing the strength of consumer attitudes and behaviour. The findings of the study add to the elaboration likelihood model and online second-hand marketplace literature.

Keywords: consumer behaviour, elaboration likelihood model, facebook marketplace, c2c marketing

Procedia PDF Downloads 122
11675 A Preliminary Study of the Reconstruction of Urban Residential Public Space in the Context of the “Top-down” Construction Model in China: Based on Research of TianZiFang District in Shanghai and Residential Space in Hangzhou

Authors: Wang Qiaowei, Gao Yujiang

Abstract:

With the economic growth and rapid urbanization after the reform and opening up, some of China's fast-growing cities demolished former dwellings and built modern residential quarters. The blind, incomplete imitation of western modern cities and one-off construction lacking a feedback mechanism have intensified this phenomenon, causing citizens to gradually expand their living scale with the popularization of car traffic, and a peer-to-peer lifestyle to gradually settle in. The construction of large-scale commercial centers has created obstacles for small businesses around residential areas, so the space for residents' interaction has been compressed. At the same time, the advocated Central Business District (CBD) model has even led to the unsatisfactory reconstruction of many historical blocks, such as the Hangzhou Southern Song Dynasty Imperial Street. However, the popularity of historical spaces such as Wuzhen and Hongcun also indicates the collective memory of, and the need for, street space among Chinese residents. The evolution of Shanghai TianZiFang likewise proves the importance of the motivation of space participants in space construction in the context of the "top-down" construction model in China. In fact, there are frequent occurrences of "reconstruction", which may redefine the space, in various residential areas. If these activities can be selectively controlled and encouraged, it will be beneficial to the activation of public space as well as residents' interaction, so that the traditional Chinese street space can be reconstructed in the context of modern cities.

Keywords: rapid urbanization, traditional street space, space re-construction, bottom-up design

Procedia PDF Downloads 97
11674 The Economics of Justice as Fairness

Authors: Antonio Abatemarco, Francesca Stroffolini

Abstract:

In the economic literature, Rawls' Theory of Justice is usually interpreted in a two-stage setting, where a priority to the worst-off individual is imposed as a distributive value judgment. In this paper, instead, we model Rawls' Theory in a three-stage setting, that is, a separating line is drawn between the original position, the educational stage, and working life. Hence, we challenge the common interpretation of Rawls' Theory of Justice as Fairness by showing that this Theory goes well beyond the definition of a distributive value judgment, in such a way as to embrace efficiency issues as well. In our model, inequalities are shown to be permitted insofar as they stimulate a greater effort in education in the population, and so economic growth. To our knowledge, this is the only possibility for the inequality to be 'bought' by both the most-advantaged and, above all, the least-advantaged individual, as suggested by the Difference Principle. Finally, by recalling the old tradition of 'universal ex-post efficiency', we show that a unique optimal social contract does not exist behind the veil of ignorance; more precisely, only the set of potentially Rawls-optimal social contracts can be identified a priori, and partial justice orderings derived accordingly.

Keywords: justice, Rawls, inequality, social contract

Procedia PDF Downloads 208
11673 Structural Investigation of Na2O–B2O3–SiO2 Glasses Doped with NdF3

Authors: M. S. Gaafar, S. Y. Marzouk

Abstract:

Sodium borosilicate glasses doped with different contents of NdF3 (mol %) have been prepared by the rapid quenching method. Ultrasonic velocity measurements (both longitudinal and shear) were carried out at room temperature at an ultrasonic frequency of 4 MHz. The elastic moduli, Debye temperature, softening temperature, and Poisson's ratio have been obtained as functions of the NdF3 modifier content. The results showed that the elastic moduli, Debye temperature, softening temperature, and Poisson's ratio change only slightly with the NdF3 mol % content. Based on FTIR spectroscopy and the theoretical bond compression model, a quantitative analysis has been carried out to obtain more information about the structure of these glasses. The study indicated that the structure of these glasses is mainly composed of SiO4 units with four bridging oxygens (Q4) and with three bridging and one non-bridging oxygen (Q3).

Keywords: borosilicate glasses, ultrasonic velocity, elastic moduli, FTIR spectroscopy, bond compression model

Procedia PDF Downloads 391
11672 Sparsity Order Selection and Denoising in Compressed Sensing Framework

Authors: Mahdi Shamsi, Tohid Yousefi Rezaii, Siavash Eftekharifar

Abstract:

Compressed sensing (CS) is a powerful new mathematical theory concentrating on sparse signals that is widely used in signal processing. The main idea is to sense sparse signals with far fewer measurements than the Nyquist sampling rate requires, but the reconstruction process becomes nonlinear and more complicated. A common dilemma in sparse signal recovery in CS is the lack of knowledge about the sparsity order of the signal, which can be viewed as a model order selection procedure. In this paper, we address the problem of sparsity order estimation in sparse signal recovery. This is of main interest in situations where the signal sparsity is unknown or the signal to be recovered is only approximately sparse. It is shown that the proposed method also leads to a kind of signal denoising, where the observations are contaminated with noise. Finally, the performance of the proposed approach is evaluated in different scenarios and compared to an existing method, which shows the effectiveness of the proposed method in terms of order selection as well as denoising.
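
To make the sparsity-order question concrete, the sketch below runs a greedy orthogonal matching pursuit with a residual-based stopping rule on noisy synthetic measurements; the stopping threshold and problem sizes are arbitrary, and this is not the estimator proposed in the paper, only a simple baseline showing how order selection and denoising interact.

```python
# Orthogonal Matching Pursuit with a residual-based stopping rule (toy example).
import numpy as np

rng = np.random.default_rng(5)
n, m, k_true = 256, 80, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)          # sensing matrix
x = np.zeros(n)
x[rng.choice(n, k_true, replace=False)] = rng.normal(size=k_true)
y = A @ x + 0.01 * rng.normal(size=m)             # noisy compressed measurements

support, residual = [], y.copy()
while np.linalg.norm(residual) > 0.05 * np.linalg.norm(y) and len(support) < m:
    support.append(int(np.argmax(np.abs(A.T @ residual))))   # most correlated atom
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # re-fit on current support
    residual = y - A[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
print("estimated sparsity order:", len(support), "(true:", k_true, ")")
print("relative reconstruction error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```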

Keywords: compressed sensing, data denoising, model order selection, sparse representation

Procedia PDF Downloads 466
11671 Model-Based Process Development for the Comparison of a Radial Riveting and Roller Burnishing Process in Mechanical Joining Technology

Authors: Tobias Beyer, Christoph Friedrich

Abstract:

Modern simulation methodology using finite element models is nowadays a recognized tool for product design and optimization. Likewise, manufacturing process design is increasingly becoming the focus of simulation methodology in order to obtain sustainable results based on fewer real-life tests here as well. In this article, two process simulations, radial riveting and roller burnishing, used for the mechanical joining of components, are explained. In the first step, the required boundary conditions are developed and implemented in the respective simulation models. This is followed by process space validation. With the help of the validated models, the interdependencies of the input parameters are investigated and evaluated by means of sensitivity analyses. Limit case investigations are carried out and evaluated with the aid of the process simulations. Likewise, a comparison of the two joining methods with each other becomes possible.

Keywords: FEM, model-based process development, process simulation, radial riveting, roller burnishing, sensitivity analysis

Procedia PDF Downloads 92
11670 Transformational Leadership Style and Organizational Commitment: An Empirical Assessment

Authors: Ugochukwu D. Abasilim, Aize I. Obayan, Adedayo J. Odukoya, Godwyns Agube, Power A. I. Wogu, Nchekwube Excellence-Oluye

Abstract:

This paper examines the effect of transformational leadership style on organizational commitment among private university employees in Nigeria. A quantitative methodology was adopted for this study. A structured Multi-factor Leadership Questionnaire (MLQ) developed by Bass and Avolio (1997) and the Organizational Commitment Questionnaire (OCQ) developed by Meyer and Allen (1997) were the major instruments used for data collection. Simple linear regression was used to test the hypothesis. The results indicated that there was no significant positive effect of transformational leadership style on organizational commitment among employees of the Nigerian private university studied. Though the respondents rated their leaders high on transformational leadership style, their organizational commitment rating was average for the majority, which implies that employees' level of commitment could not be fully accounted for by the transformational leadership style existing in the institution. This finding is antithetical to the common submission in the literature that transformational leadership style has a significant effect on organizational commitment. It is therefore recommended that further studies explore the reasons for this variance.

Keywords: leadership style, Nigeria, organizational, commitment, transformational leadership

Procedia PDF Downloads 406
11669 Groundhog Day as a Model for the Repeating Spectator and the Film Academic: Re-Watching the Same Films Again Can Create Different Experiences and Ideas

Authors: Leiya Ho Yin Lee

Abstract:

Although Groundhog Day (Harold Ramis, 1993) may seem a fairly unremarkable 1990s Hollywood comedy, it is argued that the film, with its protagonist Phil (Bill Murray), inadvertently but perfectly demonstrates an important aspect of filmmaking, film spectatorship, and film research: repetition. Very rarely does a narrative film use one, and only one, take in its shooting. The multiple 'repeats' of Phil's various endeavours due to his being trapped in a perpetual loop of the same day — from stealing money and tricking a woman into a casual relationship, to his multiple suicides, to eventually helping people in need — make the process of doing multiple 'takes' in filmmaking explicit. Perhaps more significantly, Phil represents a perfect model for the spectator/cinephile who has seen their favourite film so many times that they can remember every single detail. Crucially, the favourite film never changes, as it is a recording, but the cinephile's experience of that very same film is most likely different each time they watch it again, just as Phil's character and personality are completely transformed, from selfish and egotistic, to depressed and nihilistic, and ultimately to sympathetic and caring, even though he is living the exact same day. Furthermore, the author did not come up with this stimulating juxtaposition of film spectatorship and Groundhog Day the first time the author saw the film; it took a few casual re-viewings to notice the film's self-reflexivity. Then, when working on it in the author's research, the author had to re-view the film more times and subsequently noticed even more things previously unnoticed. In this way, Groundhog Day not only stands as a model for filmmaking and film spectatorship; it also illustrates the act of academic research, especially in Film Studies, where repeatedly viewing the same films is a prerequisite before new ideas and concepts are discovered from old material. This also recalls Deleuze's thesis on difference and repetition: repetition creates difference, and it is difference that creates thought.

Keywords: narrative comprehension, repeated viewing, repetition, spectatorship

Procedia PDF Downloads 308
11668 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm

Authors: Zachary Huffman, Joana Rocha

Abstract:

Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) that develops over an aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be represented by the pressure fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those of Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS equations for the pressure fluctuations and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol'yakov, and Rackl and Weston models) were derived by modifying these early models or from physical principles. Overall, these models have had varying levels of accuracy, but, in general, they are most accurate at the specific Reynolds and Mach numbers for which they were developed, while being less accurate under other flow conditions. Despite this, research into alternative methods for deriving such models has been rather limited. More recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R and TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through feature selection), and it is computationally faster than machine learning. The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
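
The study itself uses R's stepwise routines; purely as a conceptual illustration, the sketch below performs forward stepwise selection on AIC with statsmodels in Python over hypothetical boundary-layer predictors of a wall-pressure PSD, showing how an uncorrelated input is filtered out.

```python
# Forward stepwise selection on AIC over hypothetical flow predictors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 200
df = pd.DataFrame({
    "Uinf": rng.uniform(20, 60, n),          # free-stream velocity (assumed predictor)
    "delta": rng.uniform(0.01, 0.05, n),     # boundary-layer thickness
    "tau_w": rng.uniform(1, 5, n),           # wall shear stress
    "noise_var": rng.normal(size=n),         # irrelevant candidate predictor
})
df["log_psd"] = (2.0 * np.log(df["tau_w"]) + 0.5 * np.log(df["Uinf"])
                 + rng.normal(scale=0.1, size=n))

selected, remaining = [], list(df.columns.drop("log_psd"))
best_aic = np.inf
while remaining:
    aics = {c: sm.OLS(df["log_psd"], sm.add_constant(df[selected + [c]])).fit().aic
            for c in remaining}
    cand, aic = min(aics.items(), key=lambda kv: kv[1])
    if aic >= best_aic:          # stop when no candidate improves the AIC
        break
    selected.append(cand)
    remaining.remove(cand)
    best_aic = aic

print("selected predictors:", selected, "AIC:", round(best_aic, 1))
```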

Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations

Procedia PDF Downloads 125
11667 Belt Conveyor Dynamics in Transient Operation for Speed Control

Authors: D. He, Y. Pang, G. Lodewijks

Abstract:

Belt conveyors play an important role in continuous dry bulk material transport, especially in the mining industry. Speed control is expected to reduce the energy consumption of belt conveyors. Transient operation is the operation of increasing or decreasing conveyor speed for speed control. According to the literature review, current research rarely takes the conveyor dynamics in transient operation into account. However, in belt conveyor speed control, the conveyor dynamic behavior is significantly important, since poor dynamics might result in risks. In this paper, the potential risks in transient operation are analyzed. An existing finite element model is applied to build a conveyor model, and simulations are carried out to analyze the conveyor dynamics. In order to realize soft speed regulation, Harrison's sinusoid acceleration profile is applied, and the Lodewijks estimator is built to approximate the required acceleration time. A long inclined belt conveyor is studied with two major simulations, and the conveyor dynamics are presented.
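
Harrison's sinusoid acceleration profile can be written in closed form; the sketch below gives a hedged Python version in which the acceleration rises and falls as a sine so that the belt speed ramps smoothly between two set points, with the acceleration time assumed to be supplied by an estimator such as the one mentioned above.

```python
# Sinusoid (Harrison-type) speed ramp for soft belt speed regulation.
import numpy as np

def sinusoid_speed_profile(v0, v1, Ta, t):
    """Belt speed at time t for a ramp from v0 to v1 over acceleration time Ta."""
    t = np.clip(t, 0.0, Ta)
    # acceleration a(t) = (pi*(v1 - v0)/(2*Ta)) * sin(pi*t/Ta); integrating gives v(t)
    return v0 + 0.5 * (v1 - v0) * (1.0 - np.cos(np.pi * t / Ta))

t = np.linspace(0.0, 60.0, 7)
# Ramp from 2 m/s to 4 m/s over an assumed acceleration time of 60 s
print(sinusoid_speed_profile(2.0, 4.0, 60.0, t))
```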

Keywords: belt conveyor, speed control, transient operation, dynamics

Procedia PDF Downloads 311
11666 Aspen Plus Simulation of Saponification of Ethyl Acetate in the Presence of Sodium Hydroxide in a Plug Flow Reactor

Authors: U. P. L. Wijayarathne, K. C. Wasalathilake

Abstract:

This work presents the modelling and simulation of the saponification of ethyl acetate in the presence of sodium hydroxide in a plug flow reactor using the Aspen Plus simulation software. Plug flow reactors are widely used in industry due to their non-mixing property. The use of plug flow reactors becomes significant when there is a need for a continuous large-scale or fast reaction. Plug flow reactors have a high volumetric unit conversion, as the occurrence of side reactions is minimal. In this research, Aspen Plus V8.0 has been successfully used to simulate the plug flow reactor. In order to simulate the process as accurately as possible, the HYSYS Peng-Robinson EOS package was used as the property method. The results obtained from the simulation were verified by an experiment carried out in the EDIBON plug flow reactor module. The correlation coefficient (r2) was 0.98, which showed that the simulation results fit the experimental data satisfactorily. The developed model can be used as a guide for understanding the reaction kinetics of a plug flow reactor.
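
For orientation, the saponification kinetics in a plug flow reactor can also be integrated directly; the sketch below solves the isothermal second-order mole balance along the reactor volume with SciPy, using an assumed rate constant and flow conditions rather than the Aspen Plus settings of the study.

```python
# Isothermal PFR model for NaOH + CH3COOC2H5 -> CH3COONa + C2H5OH (second-order kinetics).
import numpy as np
from scipy.integrate import solve_ivp

k = 0.11                     # rate constant, L/(mol*s), assumed typical value near 25 °C
v0 = 0.05                    # volumetric flow rate, L/s (assumed)
CA0, CB0 = 0.1, 0.1          # inlet NaOH and ethyl acetate concentrations, mol/L

def pfr(V, C):
    """Mole balance dC/dV = -r/v0 along reactor volume V (in litres)."""
    CA, CB = C
    r = k * CA * CB          # second-order rate, mol/(L*s)
    return [-r / v0, -r / v0]

sol = solve_ivp(pfr, [0.0, 2.0], [CA0, CB0])   # reactor volume from 0 to 2 L
CA_out = sol.y[0, -1]
print(f"NaOH conversion at outlet: {1 - CA_out / CA0:.2%}")
```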

Keywords: aspen plus, modelling, plug flow reactor, simulation

Procedia PDF Downloads 582