Search results for: MATLAB reference model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18716

9536 Complicating Representations of Domestic Violence Perpetration through a Qualitative Content Analysis and Socio-Ecological Approach

Authors: Charlotte Lucke

Abstract:

This study contributes to the body of literature that analyzes and complicates oversimplified and sensationalized representations of trauma and violence by closely examining representations of perpetrators of domestic violence in the mass media. It examines how the media frames perpetrators of domestic violence through a qualitative content analysis and a socio-ecological approach to the perpetration of violence. Although the qualitative analysis has not yet been carried out, preliminary research leads this study to hypothesize that the media represents perpetrators through tropes such as the 'predator' or 'offender,' or as a demonized 'other.' It is necessary to expose and work through such stereotypes because cultivation theory demonstrates that the mass media shapes societal beliefs about and perceptions of the world. Thus, representations of domestic violence in the mass media can lead people to believe that perpetrators of violence are mere animals or criminals and to overlook the trauma that many perpetrators experience. When the media represents perpetrators as pure evil, monsters, or absolute 'others,' it leaves out the complexities of what moves people to commit domestic violence. By analyzing media representations of perpetrators and placing them in conversation with the socio-ecological approach to violence perpetration, this study complicates domestic violence stereotypes. The socio-ecological model allows researchers to consider how the interplay between individuals and their families, friends, communities, and cultures can move people to act violently. Using this model, along with psychological and psychoanalytic approaches to the etiology of domestic violence, this paper argues that media stereotypes conceal the way people's experiences of trauma, along with community and cultural norms, perpetuate the cycle of systemic trauma and violence in the home.

Keywords: domestic violence, media images, representing trauma, theorising trauma

Procedia PDF Downloads 220
9535 Role of Maternal Astaxanthin Supplementation on Brain-Derived Neurotrophic Factor and Spatial Learning Behavior in Wistar Rat Offspring

Authors: K. M. Damodara Gowda

Abstract:

Background: Maternal health and nutrition are considered the predominant factors influencing the functional development of the brain. If the mother is free of illness and genetic defects, maternal nutrition is one of the most critical factors affecting brain development. Calorie restriction causes significant impairment in spatial learning ability and in the levels of Brain-Derived Neurotrophic Factor (BDNF) in rats, but the mechanism by which prenatal under-nutrition leads to impairment of brain learning and memory function is still unclear. In the present study, the effect of prenatal astaxanthin supplementation on BDNF levels and on spatial learning and memory performance was investigated in the offspring of normal, calorie-restricted, and astaxanthin-supplemented rats. Methodology: The rats were administered 6 mg and 12 mg of astaxanthin/kg body weight for 21 days, following which acquisition and retention of spatial memory were tested in a partially baited eight-arm radial maze. The BDNF level in different regions of the brain (cerebral cortex, hippocampus, and cerebellum) was estimated by the ELISA method. Results: Calorie-restricted animals treated with astaxanthin made significantly more correct choices (P < 0.05) and fewer reference memory errors (P < 0.05) on the tenth day of training compared to the offspring of calorie-restricted animals. Calorie-restricted animals treated with astaxanthin also made significantly more correct choices (P < 0.001) than untreated calorie-restricted animals in a retention test 10 days after the training period. The mean BDNF level in the cerebral cortex, hippocampus, and cerebellum of calorie-restricted animals treated with astaxanthin did not show significant variation from that of control animals. Conclusion: The findings of the study indicated that memory and learning were impaired in the offspring of calorie-restricted rats, which was effectively modulated by astaxanthin at a dosage of 12 mg/kg body weight. Similarly, the BDNF level in the cerebral cortex, hippocampus, and cerebellum also declined in the offspring of calorie-restricted animals and was effectively normalized by astaxanthin.

Keywords: calorie restriction, learning, memory, cerebral cortex, hippocampus, cerebellum, BDNF, astaxanthin

Procedia PDF Downloads 219
9534 Informal Carers in Telemonitoring of Users with Pacemakers: Characteristics, Time of Services Provided and Costs

Authors: Antonio Lopez-Villegas, Rafael Bautista-Mesa, Emilio Robles-Musso, Daniel Catalan-Matamoros, Cesar Leal-Costa

Abstract:

Objectives: The purpose of this trial was to evaluate the burden borne by, and the costs to, informal caregivers of users with telemonitoring of pacemakers. Methods: This is a controlled, non-randomised clinical trial, with data collected from informal caregivers five years after implantation of the pacemakers. The Spanish version of the Survey on Disabilities, Personal Autonomy, and Dependency Situations was used to collect information on clinical and social characteristics, levels of professionalism, duration and types of care, difficulties in providing care, health status, economic and job aspects, and the impact on family or leisure due to informal caregiving for patients with pacemakers. Results: After five years of follow-up, 55 users with pacemakers finished the study. Of these, 50 were helped by a caregiver, 18 in the telemonitoring group (TM) and 32 in the conventional follow-up group (HM). Overall, females represented 96.0% of the informal caregivers (88.89% in the TM group and 100.0% in the HM group). The mean ages were 63.17 ± 15.92 and 63.13 ± 14.56 years, respectively (p = 0.83). The majority (88.0%) of the caregivers declared that they had to provide their services between 6 and 7 days per week (83.33% in the TM group versus 90.63% in the HM group), without significant differences between the two groups. The costs related to care provided by the informal caregivers were 47.04% higher in the conventional follow-up group than in the TM group. Conclusions: The results of this trial confirm that there were no significant differences between the informal caregivers in the two follow-up groups with regard to baseline characteristics, workload, and time worked. The costs incurred by informal caregivers providing care for users with pacemakers included in the telemonitoring group are significantly lower than those in the conventional follow-up group. Trial registration: ClinicalTrials.gov NCT02234245. Funding: The PONIENTE study has been funded by the General Secretariat for Research, Development and Innovation, Regional Government of Andalusia (Spain), project reference number PI/0256/2017, under the research call 'Development and Innovation Projects in the Field of Biomedicine and Health Sciences', 2017.

Keywords: costs, disease burden, informal caregiving, pacemaker follow-up, remote monitoring, telemedicine

Procedia PDF Downloads 126
9533 Electronic Spectral Function of a Double Quantum Dot–Superconductor Nanoscopic Junction

Authors: Rajendra Kumar

Abstract:

We study the electronic spectral density of two coupled quantum dots sandwiched between superconducting leads, where the first dot (QD1) is connected to the left superconducting lead, the second dot (QD2) is connected to the right superconducting lead, and QD1 and QD2 are coupled to each other. The electronic spectral density is computed for quantum dots between superconducting leads with s-wave symmetry of the superconducting order parameter; such a junction is called a superconductor-quantum dot-superconductor (S-QD-S) junction. For this purpose, we consider a renormalized Anderson model that includes the coupling of the superconducting leads to the quantum dot levels and an attractive BCS-type effective interaction in the superconducting leads. We employ the Green's function technique to obtain the superconducting order parameter within the BCS framework and the Ambegaokar-Baratoff formalism to analyze the electronic spectral density through such an S-QD-S junction. It is pointed out that the electronic spectral density through such a junction depends in an essential way on the attractive pairing interaction in the leads, on the energy of the dot levels with respect to the Fermi energy, and on the coupling parameter between the two dots. On the basis of numerical analysis, we compare the theoretical results for the electronic spectral density with recent existing theoretical transport analyses. A further energy scale of the QDs is the charging energy, which may give rise to effects based on the interplay of Coulomb repulsion and superconducting correlations. It is, therefore, an interesting question to ask how the discrete level spectrum and the charging energy affect the DC and AC Josephson transport between two superconductors coupled via a QD. In the absence of a bias voltage, a finite DC current can be sustained in such an S-QD-S junction by the DC Josephson effect.
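
As a point of reference, a generic Hamiltonian for such a serial S-QD-QD-S setup (two single-level dots coupled in series between BCS leads; the renormalized parameters used in this work are not reproduced here) can be sketched as

```latex
H \;=\; \sum_{\alpha=L,R}\sum_{k\sigma}\epsilon_{k}\,c^{\dagger}_{\alpha k\sigma}c_{\alpha k\sigma}
 \;-\;\sum_{\alpha=L,R}\sum_{k}\bigl(\Delta_{\alpha}\,c^{\dagger}_{\alpha k\uparrow}c^{\dagger}_{\alpha,-k\downarrow}+\mathrm{h.c.}\bigr)
 \;+\;\sum_{i=1,2}\sum_{\sigma}\epsilon_{i}\,d^{\dagger}_{i\sigma}d_{i\sigma}
 \;+\; t\sum_{\sigma}\bigl(d^{\dagger}_{1\sigma}d_{2\sigma}+\mathrm{h.c.}\bigr)
 \;+\;\sum_{k\sigma}\bigl(V_{L}\,c^{\dagger}_{Lk\sigma}d_{1\sigma}+V_{R}\,c^{\dagger}_{Rk\sigma}d_{2\sigma}+\mathrm{h.c.}\bigr),
\qquad
A_{i}(\omega) \;=\; -\tfrac{1}{\pi}\,\mathrm{Im}\,G^{r}_{ii}(\omega)
```

where the Delta_alpha are the BCS gaps of the leads, t is the interdot coupling, V_L and V_R are the dot-lead tunneling amplitudes, and the spectral density A_i(omega) of dot i follows from the retarded dot Green's function.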

Keywords: quantum dots, S-QD-S junction, BCS superconductors, Anderson model

Procedia PDF Downloads 360
9532 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The probability of human errors may be low, but their risk would be enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factors which may raise the possibility of human errors. During the past decades, many research results have shown that the performance of human operators may vary over time due to a variety of factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. Until now, many assessment tools have been developed to assess the stress level of human workers. However, it is still questionable to utilize them for anticipating human performance, which is related to human error possibility, because they were mainly developed from the viewpoint of mental health rather than industrial safety. The stress level of a person may go up or down with work time. In that sense, if such tools are to be applicable in the safety domain, they should at least be able to assess the variation resulting from work time. Therefore, this study aimed to compare their applicability for safety purposes. More than 10 kinds of work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be factors closely related to work stress. The results showed that most tools mainly placed their weights on some common organizational factors such as demands, supports, and relationships, in sequence, and their weights were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall countermeasures within a PDCA cycle or risk management activities, which would be far from practical human error prevention. Thus, it was concluded that applying stress assessment tools mainly developed for mental health seems to be impractical for safety purposes with respect to anticipating human performance, and that the development of a new assessment tool would be inevitable if one wants to assess stress level in terms of human performance variation and accident prevention. As a practical countermeasure, this study therefore proposed a new scheme for assessing the work stress level of a human operator that may vary over work time and that is closely related to the possibility of human errors.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 657
9531 Development of Methods for Plastic Injection Mold Weight Reduction

Authors: Bita Mohajernia, R. J. Urbanic

Abstract:

Mold making techniques have focused on meeting the customers' functional and process requirements; however, today, molds are increasing in size and sophistication and are difficult to manufacture, transport, and set up due to their size and mass. Present mold weight saving techniques focus on pockets to reduce the mass of the mold, but the overall size is still large, which introduces costs related to the stock material purchase, processing time for process planning, machining and validation, and excess waste material. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use Finite Element Analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project will reduce manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods that provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing the weight, using the OptiStruct software. The optimized CAD model is compared with the primary geometry of the mold from the NX software. Results of the optimization show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.

Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction

Procedia PDF Downloads 277
9530 Rayleigh-Bénard-Taylor Convection of Newtonian Nanoliquid

Authors: P. G. Siddheshwar, T. N. Sakshath

Abstract:

In this paper, we perform linear and non-linear stability analyses of Rayleigh-Bénard convection of a Newtonian nanoliquid in a rotating medium (referred to as Rayleigh-Bénard-Taylor convection). Rigid-rigid isothermal boundaries are considered for the investigation. The Khanafer-Vafai-Lightstone single-phase model is used for studying instabilities in nanoliquids. Various thermophysical properties of the nanoliquid are obtained using phenomenological laws and mixture theory. The eigen boundary value problem is solved for the Rayleigh number using an analytical method by considering trigonometric eigenfunctions. We observe that the critical nanoliquid Rayleigh number is less than that of the base liquid; thus, the onset of convection is advanced by the addition of nanoparticles, and an increase in volume fraction leads to an earlier onset and thereby an increase in heat transport. The amplitudes of the convective modes required for estimating the heat transport are determined analytically. The tri-modal standard Lorenz model is derived for the steady state assuming small-scale convective motions. The effect of rotation on the onset of convection and on heat transport is investigated and depicted graphically. It is observed that the onset of convection is delayed due to rotation, which leads to a decrease in heat transport; hence, rotation has a stabilizing effect on the system. This is because part of the energy of the system is used to generate the velocity component V. We observe that the amount of heat transport is less in the case of rigid-rigid isothermal boundaries compared to free-free isothermal boundaries.
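
For orientation, the classical result for the idealized free-free isothermal case (Chandrasekhar's linear theory, not the rigid-rigid configuration solved here, which requires a numerical treatment) shows the same stabilizing trend with rotation: the stationary-convection Rayleigh number at wavenumber a grows with the Taylor number Ta = 4 Omega^2 d^4 / nu^2,

```latex
\mathrm{Ra}(a) \;=\; \frac{(\pi^{2}+a^{2})^{3} \;+\; \pi^{2}\,\mathrm{Ta}}{a^{2}},
\qquad
\mathrm{Ra}_{c} \;=\; \min_{a}\,\mathrm{Ra}(a),
```

so increasing Ta raises the critical Rayleigh number and delays onset, consistent with the rigid-rigid results described above.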

Keywords: nanoliquid, rigid-rigid, rotation, single phase

Procedia PDF Downloads 216
9529 Assessment of Landfill Pollution Load on Hydroecosystem by Use of Heavy Metal Bioaccumulation Data in Fish

Authors: Gintarė Sauliutė, Gintaras Svecevičius

Abstract:

Landfill leachates contain a number of persistent pollutants, including heavy metals. These have the ability to spread in ecosystems and accumulate in fish, most of which are classified as top consumers of trophic chains. Fish are freely swimming organisms, but, due to their species-specific ecological and behavioral properties, they often prefer the most suitable biotopes and therefore do not necessarily avoid harmful substances or environments. That is why it is necessary to evaluate the dispersion of persistent pollutants in the hydroecosystem using fish tissue metal concentrations. In hydroecosystems of hybrid type (e.g. river-pond-river), the distance from the pollution source could be a suitable indicator of such metal distribution. The studies were carried out in the hybrid-type ecosystem neighboring the Kairiai landfill, located 5 km east of Šiauliai City. Fish tissue (gills, liver, and muscle) metal concentration measurements were performed on two types of ecologically different fish according to their feeding characteristics: benthophagous (Gibel carp, roach) and predatory (Northern pike, perch). A number of mathematical models (linear, non-linear, using log and other transformations) were applied in order to identify the most satisfactory description of the interdependence between fish tissue metal concentration and the distance from the pollution source. However, only the log-multiple regression model revealed the pattern that the distance from the pollution source is closely and positively correlated with the metal concentration in all of the predatory fish tissues studied (gills, liver, and muscle).
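
A minimal sketch of the kind of log-multiple regression referred to above, written in Python with synthetic data and hypothetical variable names (the real predictors and samples are those described in the abstract and are not reproduced here):

```python
# Minimal sketch (hypothetical column names, synthetic data): a log-multiple
# regression of fish tissue metal concentration on distance from the landfill
# plus a second covariate, in the spirit of the model family compared in the paper.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 60
distance_m = rng.uniform(100, 5000, n)        # distance from pollution source (m)
fish_length_cm = rng.uniform(15, 45, n)       # illustrative covariate
true_log_conc = -1.0 + 0.4 * np.log(distance_m) + 0.02 * fish_length_cm
conc_mg_kg = np.exp(true_log_conc + rng.normal(0, 0.2, n))   # tissue metal concentration

# log-transform the response and the distance predictor
X = np.column_stack([np.log(distance_m), fish_length_cm])
y = np.log(conc_mg_kg)

model = LinearRegression().fit(X, y)
print("intercept:", model.intercept_)
print("coefficients (log-distance, length):", model.coef_)
print("R^2:", model.score(X, y))
```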

Keywords: bioaccumulation in fish, heavy metals, hydroecosystem, landfill leachate, mathematical model

Procedia PDF Downloads 276
9528 Optimized Parameters for Simultaneous Detection of Cd²⁺, Pb²⁺ and Co²⁺ Ions in Water Using Square Wave Voltammetry on the Unmodified Glassy Carbon Electrode

Authors: K. Sruthi, Sai Snehitha Yadavalli, Swathi Gosh Acharyya

Abstract:

Water is the most crucial element for sustaining life on earth. Increasing water pollution directly or indirectly leads to harmful effects on human life. Most heavy metal ions are harmful in their cationic form. These heavy metal ions are released by various activities such as disposing of batteries, industrial wastes, automobile emissions, and soil contamination. Ions such as Pb²⁺, Co²⁺, and Cd²⁺ are carcinogenic and show many harmful effects when consumed beyond the limits proposed by the WHO. The simultaneous detection of these highly toxic heavy metal ions (Pb²⁺, Co²⁺, Cd²⁺) is reported in this study. There are many analytical methods for their quantification, but electrochemical techniques are given high priority because of their sensitivity and ability to detect lower concentrations. Among electrochemical methods, square wave voltammetry was preferred due to the suppression of interfering background currents. Square wave voltammetry was performed on a GCE for the quantitative detection of the ions. A three-electrode system was chosen for experimentation, consisting of a glassy carbon electrode as the working electrode (3 mm diameter), an Ag/AgCl electrode as the reference electrode, and a platinum wire as the counter electrode. Detection was optimized by tuning the experimental parameters, namely pH, scan rate, and temperature, and under the optimized conditions square wave voltammetry was performed for simultaneous detection. Scan rates were varied from 5 mV/s to 100 mV/s, and it was found that at 25 mV/s all three ions were detected simultaneously with well-defined peaks at their respective stripping potentials. The pH was varied from 3 to 8, and pH 5 was taken as the optimum, which holds good for all three ions: the response decreased at lower pH because of hydrogen gas evolution, and it decreased again above pH 5 because of hydroxide formation on the surface of the working electrode (GCE). The temperature was varied from 25 °C to 45 °C, and 35 °C was taken as the optimum for the three ions. Deposition and stripping potentials of +1.5 V and -1.5 V were applied, with a rest time of 150 seconds. The three ions were detected at stripping potentials of -0.84 V for Cd²⁺, -0.54 V for Pb²⁺, and -0.44 V for Co²⁺. The detection parameters were thus optimized on a glassy carbon electrode for simultaneous detection of the ions at lower concentrations by square wave voltammetry.

Keywords: cadmium, cobalt, lead, glassy carbon electrode, square wave anodic stripping voltammetry

Procedia PDF Downloads 98
9527 Count of Trees in East Africa with Deep Learning

Authors: Nubwimana Rachel, Mugabowindekwe Maurice

Abstract:

Trees play a crucial role in maintaining biodiversity and providing various ecological services. Traditional methods of counting trees are time-consuming, and there is a need for more efficient techniques; deep learning makes it feasible to identify the multi-scale elements hidden in aerial imagery. This research focuses on the application of deep learning techniques for automated tree detection and counting in both forest and non-forest areas using satellite imagery. The objective is to identify the most effective model for automated tree counting. We used different deep learning models, namely YOLOv7, SSD, and U-Net, along with Generative Adversarial Networks to generate synthetic training samples and other augmentation techniques, including Random Resized Crop, AutoAugment, and linear contrast enhancement. These models were trained and fine-tuned using satellite imagery to identify and count trees. The performance of the models was assessed through multiple trials; after training and fine-tuning, U-Net demonstrated the best performance, with a validation loss of 0.1211, validation accuracy of 0.9509, and validation precision of 0.9799. This research showcases the success of deep learning in accurate tree counting through remote sensing, particularly with the U-Net model. It represents a significant contribution to the field by offering an efficient and precise alternative to conventional tree-counting methods.
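
A minimal sketch of the kind of augmentation pipeline named above, assuming PyTorch/torchvision and a hypothetical image tile path; a ColorJitter contrast adjustment stands in for the linear contrast enhancement step:

```python
# Minimal sketch (assumed file path; ColorJitter approximates linear contrast enhancement):
# a torchvision pipeline combining Random Resized Crop, AutoAugment, and contrast adjustment.
from torchvision import transforms
from PIL import Image

train_transform = transforms.Compose([
    transforms.RandomResizedCrop(256, scale=(0.6, 1.0)),  # random crop, resized to 256x256
    transforms.AutoAugment(),                              # learned augmentation policy
    transforms.ColorJitter(contrast=0.3),                  # simple stand-in for contrast enhancement
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

img = Image.open("tiles/tile_0001.png").convert("RGB")     # hypothetical satellite image tile
augmented = train_transform(img)                            # tensor of shape (3, 256, 256)
print(augmented.shape)
```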

Keywords: remote sensing, deep learning, tree counting, image segmentation, object detection, visualization

Procedia PDF Downloads 46
9526 Exacerbated Psychotic Symptoms, Social Support, Stressful Life Events, and Medication Use Self-Efficacy: Their Impact on Social Dysfunction in a Cross-Sectional Self-Rated Study of Persons with Schizophrenia Who Misuse Methamphetamines

Authors: Ek-Uma Imkome, Jintana Yunibhand, Waraporn Chaiyawat

Abstract:

Background: Persons with schizophrenia who misuse methamphetamines suffer from social dysfunction that impacts their quality of life. Knowledge of the factors related to social dysfunction will guide effective interventions. Objectives: To determine the direct, indirect, and total effects of exacerbated psychotic symptoms, social support, stressful life events, and medication use self-efficacy on social dysfunction in Thai patients with schizophrenia and methamphetamine misuse. Methods: Data were collected by self-report from patients with schizophrenia and methamphetamine misuse. A linear structural relationship model was used to test the hypothesized path model. Results: The hypothesized model was found to fit the empirical data and explained 54% of the variance in psychotic symptoms (χ² = 114.35, df = 92, p = 0.05, χ²/df = 1.24, GFI = 0.96, AGFI = 0.92, CFI = 1.00, NFI = 0.99, NNFI = 0.99, RMSEA = 0.02). The highest total effect on social dysfunction was that of psychotic symptoms (0.67, p < 0.05). Medication use self-efficacy had a direct effect on psychotic symptoms (-0.25, p < 0.01), and social support had a direct effect on medication use self-efficacy (0.36, p < 0.01). Conclusions: Psychotic symptoms and stressful life events were the significant factors with a direct influence on social dysfunction. Therefore, interventions designed to manage these factors are crucial in order to enhance social functioning in this population.

Keywords: psychotic symptoms, methamphetamine, schizophrenia, stressful life events, social dysfunction, social support, medication use self efficacy

Procedia PDF Downloads 197
9525 Land Suitability Prediction Modelling for Agricultural Crops Using Machine Learning Approach: A Case Study of Khuzestan Province, Iran

Authors: Saba Gachpaz, Hamid Reza Heidari

Abstract:

The sharp increase in population growth puts more pressure on agricultural areas to satisfy the food supply. Meeting this demand requires consuming more resources, which, alongside other environmental concerns, highlights the need for sustainable agricultural development. Land-use management is a crucial factor in obtaining optimum productivity. Machine learning is a widely used technique in the agricultural sector, from yield prediction to customer behavior; the method learns patterns and correlations from the data set. In this study, nine physical control factors, namely soil classification, electrical conductivity, normalized difference water index (NDWI), groundwater level, elevation, annual precipitation, pH of water, annual mean temperature, and slope, in the alluvial plain of Khuzestan (an agricultural hotspot in Iran), are used to decide the best agricultural land use for both rainfed and irrigated agriculture for ten different crops. For this purpose, each variable was imported into ArcGIS, and a raster layer was obtained. Next, using training samples, all layers were imported into the Python environment, a random forest model was applied, and the weight of each variable was specified. In the final step, the results were visualized using a digital elevation model, and the importance of all factors for each of the crops was obtained. Our results show that although 62% of the study area is allocated to agricultural purposes, only 42.9% of these areas can be classified as suitable for cultivation.
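
A minimal sketch of the random-forest step described above, assuming a hypothetical CSV of training samples extracted from the raster layers (column names are illustrative, not those of the study's data set):

```python
# Minimal sketch (hypothetical feature table): training a random forest on the nine
# physical control factors and reading off their importances, as described above.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

features = ["soil_class", "ec", "ndwi", "groundwater_level", "elevation",
            "precipitation", "water_ph", "mean_temperature", "slope"]
df = pd.read_csv("training_samples.csv")           # hypothetical file of raster samples
X, y = df[features], df["suitability_class"]       # hypothetical label column

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
rf = RandomForestClassifier(n_estimators=500, random_state=42)
rf.fit(X_train, y_train)

print("test accuracy:", rf.score(X_test, y_test))
for name, importance in sorted(zip(features, rf.feature_importances_),
                               key=lambda p: p[1], reverse=True):
    print(f"{name:>20s}: {importance:.3f}")
```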

Keywords: land suitability, machine learning, random forest, sustainable agriculture

Procedia PDF Downloads 66
9524 Effect of Gas Boundary Layer on the Stability of a Radially Expanding Liquid Sheet

Authors: Soumya Kedia, Puja Agarwala, Mahesh Tirumkudulu

Abstract:

Linear stability analysis is performed for a radially expanding liquid sheet in the presence of a gas medium. A liquid sheet can break up because of the aerodynamic effect as well as because of its thinning; however, these effects are usually studied separately, as the combined formulation becomes complicated and difficult to solve. The present work combines the aerodynamic effect and the thinning effect while ignoring non-linearity in the system. This is done by taking into account the formation of the gas boundary layer whilst neglecting viscosity in the liquid phase. Axisymmetric flow is assumed for simplicity. Base state analysis results in a Blasius-type system which can be solved numerically. Perturbation theory is then applied to study the stability of the liquid sheet, where the gas-liquid interface is subjected to small deformations. The linear model derived here can be applied to investigate the instability for sinuous as well as varicose modes, where the former represents displacement of the centerline of the sheet and the latter represents modulation of the sheet thickness. Temporal instability analysis is performed for sinuous modes, which are significantly more unstable than varicose modes, at a fixed radial distance, implying a local stability analysis. The growth rates, evaluated at fixed wavenumbers, predicted by the present model are significantly lower than those obtained from the inviscid Kelvin-Helmholtz instability and compare better with experimental results. Thus, the present theory gives better insight into the stability of a thin liquid sheet.
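
For comparison, the classical inviscid Kelvin-Helmholtz benchmark referred to above (two unbounded inviscid streams of densities rho_l and rho_g with relative velocity Delta U and surface tension sigma, gravity neglected; this is the textbook result, not the boundary-layer model of this work) gives the temporal growth rate

```latex
\omega_{i}(k) \;=\; \sqrt{\frac{\rho_{l}\rho_{g}\,k^{2}\,\Delta U^{2}}{(\rho_{l}+\rho_{g})^{2}} \;-\; \frac{\sigma k^{3}}{\rho_{l}+\rho_{g}}}\,,
```

for wavenumbers k at which the radicand is positive; the present boundary-layer model predicts growth rates below this estimate.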

Keywords: boundary layer, gas-liquid interface, linear stability, thin liquid sheet

Procedia PDF Downloads 214
9523 Toward the Decarbonisation of EU Transport Sector: Impacts and Challenges of the Diffusion of Electric Vehicles

Authors: Francesca Fermi, Paola Astegiano, Angelo Martino, Stephanie Heitel, Michael Krail

Abstract:

In order to achieve the targeted emission reductions for the decarbonisation of the European economy by 2050, fundamental contributions are required from both the energy and transport sectors. The objective of this paper is to analyse the impacts of a large-scale diffusion of e-vehicles, either battery-based or fuel-cell, together with the implementation of transport policies aiming at decreasing the use of motorised private modes, in order to achieve greenhouse gas emission reduction goals in the context of a future high share of renewable energy. The analysis of the impacts and challenges of future scenarios on the transport sector is performed with the ASTRA (ASsessment of TRAnsport Strategies) model. ASTRA is a strategic system-dynamics model at European scale (EU28 countries, Switzerland and Norway), consisting of different sub-modules related to specific aspects: the transport system (e.g. passenger trips, tonnes moved), the vehicle fleet (composition and evolution of technologies), the demographic system, the economic system, and the environmental system (energy consumption, emissions). A key feature of ASTRA is that the modules are linked together: changes in one system are transmitted to other systems and can feed back to the original source of variation. Thanks to its multidimensional structure, ASTRA is capable of simulating a wide range of impacts stemming from the application of transport policy measures: the model addresses direct impacts as well as second-level and third-level impacts. The simulation of the different scenarios is performed within the REFLEX project, where the ASTRA model is employed in combination with several energy models in a comprehensive modelling system. From the transport sector perspective, some of the impacts are driven by the trend of electricity prices estimated by the energy modelling system. Nevertheless, the major drivers towards a low-carbon transport sector are policies related to increased fuel efficiency of conventional drivetrain technologies, improvement of demand management (e.g. increase of public transport and car sharing services/usage), and diffusion of environmentally friendly vehicles (e.g. electric vehicles). The final modelling results of the REFLEX project will be available from October 2018. The analysis of the impacts and challenges of future scenarios is performed in terms of transport, environmental and social indicators. The diffusion of e-vehicles produces a consistent reduction of future greenhouse gas emissions, although the decarbonisation target can be achieved only with the contribution of complementary transport policies on demand management and support for the deployment of low-emission alternative energy for non-road transport modes. The paper explores the implications through time of transport policy measures on mobility and the environment, underlining to what extent they can contribute to a decarbonisation of the transport sector. Acknowledgements: The results refer to the REFLEX project, which has received grants from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No. 691685.

Keywords: decarbonisation, greenhouse gas emissions, e-mobility, transport policies, energy

Procedia PDF Downloads 138
9522 Influence of Foundation Size on Seismic Response of Mid-rise Buildings Considering Soil-Structure-Interaction

Authors: Quoc Van Nguyen, Behzad Fatahi, Aslan S. Hokmabadi

Abstract:

Performance-based seismic design is a modern approach to earthquake-resistant design, shifting the emphasis from "strength" to "performance". Soil-Structure Interaction (SSI) can significantly influence the performance level of structures. In this paper, a fifteen-storey moment-resisting frame sitting on shallow foundations (footings) of different sizes is simulated numerically using ABAQUS software. The developed three-dimensional numerical simulation accounts for the nonlinear behaviour of the soil medium by considering the variation of soil stiffness and damping as a function of the shear strain developed in the soil elements during the earthquake. An elastic-perfectly plastic model is adopted to simulate the piles and structural elements. Quiet boundary conditions are assigned to the numerical model, and appropriate interface elements, capable of modelling sliding and separation between the foundation and soil elements, are considered. Numerical results in terms of base shear, lateral deformations, and inter-storey drifts of the structure are compared for the soil-structure interaction system with different foundation sizes as well as for the fixed-base condition (excluding SSI). It can be concluded that conventional design procedures excluding SSI may result in an aggressive design. Moreover, the size of the foundation can influence the dynamic characteristics and seismic response of the building due to SSI and should therefore be given careful consideration in order to ensure a safe and cost-effective seismic design.
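
On the Rayleigh damping mentioned in the keywords, a minimal sketch (with assumed target frequencies and damping ratio, not values taken from the paper) of how the mass- and stiffness-proportional coefficients are typically derived for such a model:

```python
# Minimal sketch (assumed targets): Rayleigh damping coefficients alpha (mass-proportional)
# and beta (stiffness-proportional) chosen so that a given damping ratio is matched at two
# control frequencies, as commonly done when setting up models like the one described above.
import math

def rayleigh_coefficients(zeta: float, f1_hz: float, f2_hz: float) -> tuple[float, float]:
    """Return (alpha, beta) giving damping ratio `zeta` at frequencies f1_hz and f2_hz."""
    w1, w2 = 2.0 * math.pi * f1_hz, 2.0 * math.pi * f2_hz
    alpha = 2.0 * zeta * w1 * w2 / (w1 + w2)   # mass-proportional term
    beta = 2.0 * zeta / (w1 + w2)              # stiffness-proportional term
    return alpha, beta

# Hypothetical values: 5% damping anchored at the first and a higher structural mode.
alpha, beta = rayleigh_coefficients(zeta=0.05, f1_hz=0.8, f2_hz=5.0)
print(f"alpha = {alpha:.4f} 1/s, beta = {beta:.5f} s")
# Resulting damping ratio at any circular frequency w: zeta(w) = alpha/(2*w) + beta*w/2
```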

Keywords: soil-structure-interaction, seismic response, shallow foundation, abaqus, rayleigh damping

Procedia PDF Downloads 494
9521 Hardness Map of Human Tarsals, Metatarsals and Phalanges of Toes

Authors: Irfan Anjum Manarvi, Zahid Ali kaimkhani

Abstract:

Predicting the location of fractures in human bones has been a keen area of research for the past few decades. A variety of tests for hardness, deformation, and strain field measurement have been conducted in the past but are considered insufficient due to various limitations. Researchers have therefore proposed further studies to address inaccuracies in measurement methods, testing machines, and experimental errors. Advancements in and the availability of hardware, measuring instrumentation, and testing machines can now provide remedies to these limitations. The human foot is a critical part of the body, exposed to various forces throughout its life. A number of products are developed for its protection and care, which often do not provide sufficient protection and may themselves become a source of stress because the delicacy of the bones in the feet is not considered. Continuous strain or overloading of the feet may occur, resulting in discomfort and even fracture. The mechanical properties of the tarsals, metatarsals, and phalanges are therefore the primary consideration for all such design applications. Hardness is one of the mechanical properties considered very important for establishing the mechanical resistance of a material against applied loads. Past researchers have investigated the mechanical properties of these bones; however, their results were based on a limited number of experiments and average hardness values, owing to limitations of either samples or testing instruments, and they therefore proposed further studies in this area. The present research has been carried out to develop a hardness map of the human foot by measuring microhardness at various locations on these bones. Results are compiled in the form of the distance from a reference point on a bone and the hardness values for each surface. The number of test results is far greater than in previous studies, and they are spread over each bone to give a complete hardness map of these bones. These results could also be used to establish other properties, such as stress and strain distributions in the bones. Industrial engineers could also use them for the design and development of various accessories for human foot health care and comfort, and for further research in the same areas.

Keywords: tarsals, metatarsals, phalanges, hardness testing, biomechanics of human foot

Procedia PDF Downloads 412
9520 Load Balancing Technique for Energy Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction; this cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This determines the need for new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time, and improving performance, but none of them have considered energy consumption and carbon emission. Therefore, our proposed work goes towards energy efficiency: in this paper, we introduce a technique for energy-efficient load balancing. This technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
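
Purely as an illustration of an energy-aware placement rule (a generic greedy heuristic, not the technique proposed in the paper), a minimal sketch that assigns each task to the node whose estimated power increase is smallest without overloading it:

```python
# Illustrative only: greedy, energy-aware task placement over nodes with a simple
# linear power model (idle power plus utilisation-proportional increase).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    capacity: float            # maximum load the node can take
    idle_power: float          # watts drawn when idle
    busy_power: float          # watts drawn at full utilisation
    load: float = 0.0
    tasks: list = field(default_factory=list)

    def power(self, extra: float = 0.0) -> float:
        util = min((self.load + extra) / self.capacity, 1.0)
        return self.idle_power + (self.busy_power - self.idle_power) * util

def place(task_load: float, nodes: list[Node]) -> Node:
    candidates = [n for n in nodes if n.load + task_load <= n.capacity]
    if not candidates:
        raise RuntimeError("no node can accept the task without overloading")
    # pick the node whose power draw grows the least when the task is added
    best = min(candidates, key=lambda n: n.power(task_load) - n.power())
    best.load += task_load
    best.tasks.append(task_load)
    return best

nodes = [Node("n1", 100, 80, 200), Node("n2", 100, 60, 220), Node("n3", 150, 120, 260)]
for t in [10, 35, 20, 50, 15]:
    chosen = place(t, nodes)
    print(f"task {t:>3} -> {chosen.name} (load {chosen.load})")
print("total power (W):", sum(n.power() for n in nodes))
```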

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 431
9519 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression models (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data, aggregated at a 15-minute interval, from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states; its influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed thresholds using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning peak hours and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
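
A simplified stand-in for the mixture-based state identification (using scikit-learn's Dirichlet-process Gaussian mixture on synthetic speed/occupancy pairs rather than the paper's Bayesian DML and change-point models):

```python
# Simplified stand-in (not the paper's model): a Dirichlet-process Gaussian mixture over
# (speed, occupancy) observations that lets the data choose how many of the candidate
# components are used; the surviving clusters can then be read as traffic states.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
# synthetic 15-minute observations: columns = [speed (mph), occupancy (%)]
free_flow    = np.column_stack([rng.normal(65, 3, 300), rng.normal(8, 2, 300)])
transitional = np.column_stack([rng.normal(52, 4, 200), rng.normal(18, 4, 200)])
congested    = np.column_stack([rng.normal(35, 6, 150), rng.normal(35, 6, 150)])
X = np.vstack([free_flow, transitional, congested])

dpgmm = BayesianGaussianMixture(
    n_components=8,                                      # upper bound on the number of states
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(X)

labels = dpgmm.predict(X)
for k in np.unique(labels):
    mean_speed, mean_occ = dpgmm.means_[k]
    print(f"state {k}: weight={dpgmm.weights_[k]:.2f}, "
          f"mean speed={mean_speed:.1f} mph, mean occupancy={mean_occ:.1f}%")
```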

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 280
9518 Evaluation of Machine Learning Algorithms and Ensemble Methods for Prediction of Students’ Graduation

Authors: Soha A. Bahanshal, Vaibhav Verdhan, Bayong Kim

Abstract:

Graduation rates at six-year colleges are becoming a more essential indicator for incoming freshmen and for university rankings. Predicting student graduation is extremely beneficial to schools and has huge potential for targeted intervention. It is important for educational institutions since it enables the development of strategic plans that will assist or improve students' performance in achieving their degrees on time (GOT). Machine learning techniques offer a first step and a helping hand in extracting useful information from these data and gaining insights into the prediction of students' progress and performance. Data analysis and visualization techniques are applied to understand and interpret the data. The data used for the analysis contain students in science majors who graduated within six years in the academic year 2017-2018. This analysis can be used to predict the graduation of students in the next academic year. Different predictive models, such as logistic regression, decision trees, support vector machines, random forest, naïve Bayes, and KNeighborsClassifier, are applied to predict whether a student will graduate. These classifiers were evaluated with 5-fold cross-validation, and their performance was compared based on accuracy. The results indicated that the ensemble classifier achieves better accuracy, about 91.12%. This GOT prediction model would hopefully be useful to university administration and academics in developing measures for assisting and boosting students' academic performance and ensuring they graduate on time.
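
A minimal sketch of this kind of comparison, using scikit-learn with a synthetic stand-in for the student feature matrix (the real features and records are institutional data not reproduced here):

```python
# Minimal sketch (synthetic stand-in data): comparing the classifiers named above with
# 5-fold cross-validated accuracy and combining them into a simple voting ensemble.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=12, random_state=0)  # stand-in data

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(probability=True, random_state=0),
    "random_forest": RandomForestClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "knn": KNeighborsClassifier(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name:>20s}: {scores.mean():.4f}")

ensemble = VotingClassifier(estimators=list(models.items()), voting="soft")
print("ensemble:", cross_val_score(ensemble, X, y, cv=5, scoring="accuracy").mean())
```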

Keywords: prediction, decision trees, machine learning, support vector machine, ensemble model, student graduation, GOT graduate on time

Procedia PDF Downloads 61
9517 Automated Building Internal Layout Design Incorporating Post-Earthquake Evacuation Considerations

Authors: Sajjad Hassanpour, Vicente A. González, Yang Zou, Jiamou Liu

Abstract:

Earthquakes pose a significant threat to both structural and non-structural elements in buildings, putting human lives at risk. Effective post-earthquake evacuation is critical for ensuring the safety of building occupants. However, current design practices often neglect the integration of post-earthquake evacuation considerations into the early-stage architectural design process. To address this gap, this paper presents a novel automated internal architectural layout generation tool that optimizes post-earthquake evacuation performance. The tool takes an initial plain floor plan as input, along with specific requirements from the user/architect, such as minimum room dimensions, corridor width, and exit lengths. Based on these inputs, the tool first generates different architectural layouts at random. Second, human post-earthquake evacuation behaviour is thoroughly assessed for each generated layout using the advanced Agent-Based Building Earthquake Evacuation Simulation (AB2E2S) model. The AB2E2S prototype is a post-earthquake evacuation simulation tool that incorporates variables related to earthquake intensity, architectural layout, and human factors. It leverages a hierarchical agent-based simulation approach, incorporating reinforcement learning to mimic human behaviour during evacuation. The model evaluates different layout options and provides feedback on evacuation flow, evacuation time, and possible casualties due to non-structural earthquake damage. By integrating the AB2E2S model into the automated layout generation tool, architects and designers can obtain optimized architectural layouts that prioritize post-earthquake evacuation performance. Through the use of the tool, architects and designers can explore various design alternatives, considering different minimum room requirements, corridor widths, and exit lengths. This approach ensures that evacuation considerations are embedded in the early stages of the design process. In conclusion, this research presents an innovative automated internal architectural layout generation tool that integrates post-earthquake evacuation simulation. By incorporating evacuation considerations into the early-stage design process, architects and designers can optimize building layouts for improved post-earthquake evacuation performance. This tool empowers professionals to create resilient designs that prioritize the safety of building occupants in the face of seismic events.

Keywords: agent-based simulation, automation in design, architectural layout, post-earthquake evacuation behavior

Procedia PDF Downloads 85
9516 Research on Energy Field Intervening in Lost Space Renewal Strategy

Authors: Tianyue Wan

Abstract:

Lost space, a concept proposed by Roger Trancik, is space that has not been used for a long time and is in decline. In his book Finding Lost Space: Theories of Urban Design, lost space is defined as those anti-traditional spaces that are unpleasant, need to be redesigned, and are of no benefit to the environment or users; they have no defined boundaries and do not connect the various landscape elements in a coherent way. With the rapid development of urbanization in China, the blind areas of urban renewal have become chaotic lost spaces that are incompatible with the rapid development of urbanization. Therefore, lost space urgently needs to be reconstructed against the background of infill development and reduction planning in China. The formation of lost space is also an invisible division of social hierarchy. This paper tries to break down social class divisions and the estrangement between people through the regeneration of lost space. Ultimately, this will enhance vitality, rebuild a sense of belonging, and create a continuous open public space for local people. Based on the concepts of lost space and the energy field, this paper clarifies the significance of the energy field in lost space renovation and then introduces the energy field into lost space by using the magnetic field in physics as a prototype. The construction of the energy field is supported by space theory, spatial morphology analysis theory, public communication theory, urban diversity theory, and city image theory. Taking Wuhan's Lingjiao Park in China as an example, this paper chooses the lost space on the west side of the park as the research object. According to the current situation of this site, energy intervention strategies are proposed from four aspects: natural ecology, space rights, intangible cultural heritage, and infrastructure configuration. Six specific lost space renewal methods are used in this work: 'riveting', 'breakthrough', 'radiation', 'inheritance', 'connection', and 'intersection'. After the renovation, the space will be re-introduced to active crowds. The integration of activities and space creates a sense of place, improves the walking experience, restores the vitality of the space, and provides a reference for the reconstruction of lost space in the city.

Keywords: dynamic vitality intervention, lost space, space vitality, sense of place

Procedia PDF Downloads 93
9515 Antioxidant Effects of C-Phycocyanin on Oxidized Astrocyte in Brain Injury Using 2D and 3D Neural Nanofiber Tissue Model

Authors: Seung Ju Yeon, Seul Ki Min, Jun Sang Park, Yeo Seon Kwon, Hoo Cheol Lee, Hyun Jung Shim, Il-Doo Kim, Ja Kyeong Lee, Hwa Sung Shin

Abstract:

In brain injury, reducing oxidative stress is the most effective way to reduce the size of the brain infarct. C-phycocyanin (C-Pc) is a well-known antioxidant protein with neuroprotective effects, obtained from green microalgae. Astrocytes are glial cells that support nerve cells such as neurons and account for a large portion of the brain. In brain injury, such as ischemia and reperfusion, astrocytes play an important role in overcoming oxidative stress and protecting the brain from reactive oxygen species (ROS) injury. However, little is known about how C-Pc regulates the antioxidant effects of astrocytes. In this study, when C-Pc was applied to oxidized astrocytes, we confirmed that the inflammatory factors interleukin-6 and interleukin-3 were increased, the antioxidant enzymes superoxide dismutase (SOD) and catalase were upregulated, and the neurotrophic factors brain-derived neurotrophic factor (BDNF) and nerve growth factor (NGF) were elevated. C-Pc was also confirmed to reduce the infarct size of the brain in ischemia and reperfusion, owing to its antioxidant effects, in a middle cerebral artery occlusion (MCAO) animal model. These results show that C-Pc can help astrocytes exert neuroprotective activities in the oxidatively stressed environment of the brain. In summary, C-Pc protects astrocytes from oxidative stress and has antioxidative, anti-inflammatory, and neurotrophic effects under ischemic conditions.

Keywords: c-phycocyanin, astrocyte, reactive oxygen species, ischemia and reperfusion, neuroprotective effect

Procedia PDF Downloads 302
9514 The System Dynamics Research of China-Africa Trade, Investment and Economic Growth

Authors: Emma Serwaa Obobisaa, Haibo Chen

Abstract:

International trade and outward foreign direct investment are factors generally recognized as important in economic growth and development. Although several scholars have endeavored to reveal the influence of trade and outward foreign direct investment (FDI) on economic growth, most studies utilized common econometric models such as vector autoregression and aggregated the variables, which for the most part yields contradictory and mixed results. Thus, there is an exigent need for a precise study of the trade and FDI effects on economic growth that applies strong econometric models and disaggregates the variables into separate individual variables to explicate their respective effects on economic growth. This will guarantee the provision of policies and strategies that are geared towards individual variables to ensure sustainable development and growth. This study, therefore, seeks to examine the causal effect of China-Africa trade and outward foreign direct investment on the economic growth of Africa using a robust and recent econometric approach, namely a system dynamics model. Our study assembles and tests a group of vital variables predominant in recent studies on trade-FDI-economic growth causality: foreign direct investment, international trade, and economic growth. Our results showed that the system dynamics method provides more accurate statistical inference regarding the direction of causality among the variables than conventional methods such as OLS and Granger causality predominantly used in the literature, as it is more robust and provides accurate critical values.
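
As a toy illustration of the system dynamics approach only (hypothetical stocks, flows, and coefficients; not the authors' model or estimates), a stock-flow simulation in which trade and FDI feed a GDP stock and GDP feeds back on both drivers:

```python
# Toy sketch: Euler-integrated stock-flow loop linking trade, FDI, and GDP.
# All parameter values are illustrative placeholders.
import numpy as np

years = np.arange(2000, 2021)
dt = 1.0

gdp = 500.0     # stock: GDP (arbitrary units), hypothetical starting value
trade = 10.0    # flow driver: trade volume
fdi = 2.0       # flow driver: outward FDI inflow

gdp_path = []
for _ in years:
    # feedback loops (illustrative coefficients)
    trade_growth = 0.08 * trade + 0.002 * gdp     # GDP reinforces trade
    fdi_growth = 0.05 * fdi + 0.0005 * gdp        # GDP attracts FDI
    gdp_inflow = 0.03 * gdp + 0.4 * trade + 0.8 * fdi
    gdp_outflow = 0.01 * gdp                      # depreciation / leakage

    gdp += dt * (gdp_inflow - gdp_outflow)
    trade += dt * trade_growth
    fdi += dt * fdi_growth
    gdp_path.append(gdp)

print(f"simulated GDP in {years[-1]}: {gdp_path[-1]:.1f} (toy units)")
```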

Keywords: economic growth, outward foreign direct investment, system dynamics model, international trade

Procedia PDF Downloads 93
9513 Convertible Lease, Risky Debt and Financial Structure with Growth Option

Authors: Ons Triki, Fathi Abid

Abstract:

The basic objective of this paper is twofold. It resides in designing a model for a contingent convertible lease contract that can ensure the financial stability of a company and recover the losses of the parties to the lease in the event of default. It also aims to compare the convertible lease contract with other financing policies with respect to the inefficiencies resulting from the debt-overhang problem and asset substitution. From this perspective, this paper highlights the interaction between investment and financing policies in a dynamic model with existing assets and a growth option, where the investment cost is financed by a contingent convertible lease and equity. We explore the impact of the contingent convertible lease on the capital structure. We also check the reliability and effectiveness of the convertible lease contract as a means of financing. Findings show that the convertible lease contract with a sufficiently high conversion ratio entails less severe inefficiencies arising from risk-shifting and debt overhang than risky debt and pure-equity financing. The problem of underinvestment pointed out by Mauer and Ott (2000) and the problem of overinvestment mentioned by Hackbarth and Mauer (2012) may be reduced under contingent convertible lease financing. Our findings predict that the firm value under contingent convertible lease financing increases globally with asset volatility instead of decreasing with business risk. The study reveals that convertible lease contracts can represent a reliable solution to protect the lessee and allow the counterparties of the lease to recover quickly upon default.

Keywords: contingent convertible lease, growth option, debt overhang, risk-shifting, capital structure

Procedia PDF Downloads 57
9512 Rethinking Urban Informality through the Lens of Inclusive Planning and Governance in Contemporary Cities: A Case Study of Johannesburg, South Africa

Authors: Blessings Masuku

Abstract:

Background: Considering that Africa is urbanizing faster than any other region globally, managing cities in the global South has become the centerpiece of the New Urban Agenda (i.e., a shared vision of how we rethink, rebuild, and manage our cities for a better and more sustainable future). This study is centered on the governance and planning of urban informality practices, with particular reference to the relationship between the state, informal actors (e.g., informal traders and informal dwellers), and other city stakeholders who are public space users (commuters, businesses, and environmental activists), and to how informal actors organize themselves to lobby the state and claim their rights in the city and how they navigate their everyday livelihood strategies. Aim: The purpose of this study is to examine and interrogate contemporary approaches and policy and regulatory frameworks for urban spatial planning and the management of informality in one of South Africa's busiest major cities, Johannesburg. Setting: The study uses the metropolitan region of the City of Johannesburg, South Africa, to understand how this contemporary industrial city manages urban informality patterns and practices, including the use of public space, land zoning, and street life, taking a closer look at the progress made and the gaps in its inclusive urban policy frameworks. Methods: This study utilized a qualitative approach that includes surveys (open-ended questions), archival research (i.e., policy and other key document reviews), and key informant interviews, mainly with city officials and informality actors. A thematic analysis was used to analyze the data collected. Contribution: This study contributes to the broader urban informality scholarship on global South cities by exploring how major cities, particularly in Africa, regulate and manage informality patterns and practices in their quest to build 'utopian' smart cities. It also brings a different perspective on the hacking strategies used by informal actors to resist harsh regulations and remain invisible in the city, which previous literature has barely explored in depth.

Keywords: inclusive planning and governance, infrastructure systems, livelihood strategies urban informality, urban space

Procedia PDF Downloads 57
9511 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach

Authors: Eric Mxolisi Mkhondo

Abstract:

Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to the naked eye. They also tend to cause unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in a system is tolerance analysis. This is a quantitative tool for predicting the tolerance variations which are defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts due to environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system's specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary the components' geometries and sizes while conforming to functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to the effects of operating temperatures. The method is used to evaluate the nominal conditions and the worst-case conditions at the maximum and minimum dimensions of assembled components. These three conditions are evaluated at specific operating temperatures (-40 °C, -18 °C, 4 °C, 26 °C, 48 °C, and 70 °C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.
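
A minimal sketch of a worst-case stack-up of the kind described above, with hypothetical dimensions, tolerances, and materials (not the paper's zoom lens data), and each nominal dimension shifted by its thermal expansion before the tolerances are summed:

```python
# Minimal sketch (hypothetical values): worst-case tolerance stack-up with a thermal
# shift dL = L * alpha * dT applied to each nominal dimension at the chosen temperature.
from dataclasses import dataclass

@dataclass
class Feature:
    nominal_mm: float      # signed nominal length contribution to the stack
    tol_mm: float          # symmetric tolerance (+/-)
    alpha_per_c: float     # coefficient of thermal expansion (1/degC)

def worst_case_gap(features: list[Feature], temp_c: float, ref_temp_c: float = 20.0):
    d_t = temp_c - ref_temp_c
    nominal = sum(f.nominal_mm * (1.0 + f.alpha_per_c * d_t) for f in features)
    stack_tol = sum(abs(f.tol_mm) for f in features)   # worst-case (arithmetic) sum
    return nominal - stack_tol, nominal, nominal + stack_tol

# Hypothetical zoom-lens-like stack: housing length minus two lens-cell lengths.
stack = [
    Feature(+40.00, 0.02, 23e-6),   # aluminium housing
    Feature(-18.00, 0.01, 8e-6),    # lens cell 1
    Feature(-18.00, 0.01, 8e-6),    # lens cell 2
]
for t in (-40, -18, 4, 26, 48, 70):
    lo, nom, hi = worst_case_gap(stack, t)
    print(f"{t:>4} degC: gap = {nom:.4f} mm (worst case {lo:.4f} .. {hi:.4f} mm)")
```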

Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism

Procedia PDF Downloads 154
9510 Molecular Detection of Leishmania from the Phlebotomus Genus: Tendency towards Leishmaniasis Regression in Constantine, North-East of Algeria

Authors: K. Frahtia, I. Mihoubi, S. Picot

Abstract:

Leishmaniasis is a group of parasitic diseases with varied clinical expression caused by flagellate protozoa of the Leishmania genus. These diseases are transmitted to humans and animals by the bite of a vector insect, the female sandfly. Among the groups of dipteran disease vectors, Phlebotominae occupy a prime position and play a significant role in human pathology, such as leishmaniasis, which affects nearly 350 million people worldwide. The vector control operation launched by health services throughout the country appears to be effective: although the prevalence of the disease remains high, especially in rural areas, leishmaniasis appears to be declining in Algeria. In this context, this study mainly concerns the molecular detection of Leishmania from the vector. Furthermore, a molecular diagnosis has also been made on skin samples taken from patients in the region of Constantine, located in the North-East of Algeria. Concerning the vector, 5858 sandflies were captured, including 4360 males and 1498 females. Male specimens were identified based on their morphological characteristics. The morphological identification highlighted the presence of the Phlebotomus genus with a prevalence of 93%, against 7% represented by the Sergentomyia genus. Regarding the identified species, P. perniciosus is the most abundant with 59.4% of the identified male population, followed by P. longicuspis with 24.7%. P. perfiliewi is poorly represented with 6.7% of specimens, followed by P. papatasi with 2.2% and S. dreyfussi with 1.5%. Concerning skin samples, 45 of the 79 collected samples (56.96%) were found positive by real-time PCR. This rate appears to be in sharp decline compared to previous years (alert peak of 30,227 cases in 2005). Concerning the detection of Leishmania from sandflies by RT-PCR, 3 of the 60 genus-level PCRs performed were positive, with melting temperatures corresponding to that of the reference strain (84.1 ± 0.4 °C for L. infantum), showing that the vectors were parasitized. On the other hand, species-level identification by RT-PCR did not give any results. This could be explained by an insufficient amount of Leishmania DNA in the vector, and it therefore supports the hypothesis of the regression of leishmaniasis in Constantine.
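
As a small illustration of the melting-temperature criterion used to call a sample positive, the snippet below checks whether observed melt-curve readings fall within the reference window reported above (84.1 ± 0.4 °C for L. infantum); the observed values in the example are invented for demonstration.

```python
# Minimal sketch of the melting-temperature decision rule described above.
# The reference window (84.1 +/- 0.4 degC for L. infantum) is taken from the
# abstract; the observed temperatures below are made-up examples.

REF_TM = 84.1      # reference melting temperature for L. infantum (degC)
TOLERANCE = 0.4    # accepted deviation around the reference (degC)

def matches_reference(observed_tm, ref_tm=REF_TM, tol=TOLERANCE):
    """True if an observed melting temperature falls within ref_tm +/- tol."""
    return abs(observed_tm - ref_tm) <= tol

observed = [84.0, 84.3, 82.7]          # hypothetical melt-curve readings
positives = [tm for tm in observed if matches_reference(tm)]
print(f"{len(positives)}/{len(observed)} samples match the L. infantum reference")
```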

Keywords: Algeria, molecular diagnostic, phlebotomus, real time PCR

Procedia PDF Downloads 259
9509 Biotechnological Recycling of Apple By-Products: A Reservoir Model to Produce a Dietary Supplement Fortified with Biogenic Phenolic Compounds

Authors: Ali Zein Aalabiden Tlais, Alessio Da Ros, Pasquale Filannino, Olimpia Vincentini, Marco Gobbetti, Raffaella Di Cagno

Abstract:

This study is an example of apple by-product (AP) recycling through a designed fermentation by the selected autochthonous Lactobacillus plantarum AFI5 and Lactobacillus fabifermentans ALI6, used singly or as binary cultures with the selected Saccharomyces cerevisiae AYI7. Compared to Raw-, Unstarted-, and Chemically Acidified-AP, Fermented-AP promoted the highest levels of total and insoluble dietary fibers, antioxidant activity, and free phenolics. The binary culture of L. plantarum AFI5 and S. cerevisiae AYI7 had the best effect on the bioavailability of phenolic compounds, as shown by a validated liquid chromatography-mass spectrometry method. The accumulation of phenolic acid derivatives highlighted microbial metabolism during AP fermentation. The bio-converted phenolic compounds were likely responsible for the increased antioxidant activity. The potential health-promoting effects of Fermented-AP were highlighted using Caco-2 cells. With variations among single and binary cultures, Fermented-AP counteracted inflammatory processes and the effects of oxidative stress in Caco-2 cells and preserved the integrity of tight junctions. An alternative and suitable model for recycling food by-products into a dietary supplement fortified with biogenic compounds was proposed. By highlighting the microbial metabolism of several phenolic compounds, the study adds clear additional value to such downstream wastes.

Keywords: apple by-products, antioxidant, fermentation, phenolic compounds

Procedia PDF Downloads 123
9508 A Resource-Based Perspective on Job Crafting Consequences: An Empirical Study from China

Authors: Eko Liao, Cheryl Zhang

Abstract:

Employee job crafting refers to employees’ proactive behaviors of making customized changes to their jobs at the cognitive, relationship, and task levels. Previous studies have investigated the different situations that trigger employees’ job crafting. However, much less is known about the consequences for both the employees themselves and their work groups. Guided by conservation of resources (COR) theory, this study investigates how employees’ job crafting increases their objective task performance and promotive voice behaviors at work. It is argued that employees gain more resources when they actively craft their job tasks, which in turn increases their job performance and encourages more constructive speak-up behaviors. Specifically, employees’ psychological resources (i.e., job engagement) and relational resources (i.e., leader-member relationships) are enhanced by effective crafting behaviors, because employees are more likely to regard their job tasks as meaningful, and their leaders are more likely to notice and recognize their dedication at work when employees craft their jobs frequently. To test this research model, around 400 employees from various organizations in mainland China joined a two-wave data collection. Employees’ job crafting behaviors in three aspects were measured at Time 1. Perceptions of resource gain (job engagement and leader-member exchange), voice, and job performance were measured at Time 2. The research model is generally supported. This study contributes to the job crafting literature by broadening the theoretical lens to a resource-based perspective. It also has the practical implication that organizations should pay more attention to employee crafting behaviors because they are closely related to employees’ in-role performance and constructive voice behaviors.

Keywords: job crafting, resource-based perspective, voice, job performance

Procedia PDF Downloads 153
9507 Government Final Consumption Expenditure, Financial Deepening, and Household and NPISHs Final Consumption Expenditure in Nigeria

Authors: Usman A. Usman

Abstract:

Undeniably, unlike the Classical view, the Keynesian perspective on aggregate demand holds a significant position in the policy, growth, and welfare of Nigeria, given government involvement and the weak effective demand of a population living on poor per capita income. This study investigates the effect of government final consumption expenditure and financial deepening on household and NPISH final consumption expenditure, using data on Nigeria from 1981 to 2019. The study employed the ADF stationarity test, the Johansen cointegration test, and a vector error correction model (VECM). The results revealed that the coefficient of government final consumption expenditure has a positive effect on household consumption expenditure in the long run. There is a long-run and short-run relationship between gross fixed capital formation and household consumption expenditure. The coefficients of financial deepening (cpsgdp) and gross fixed capital formation indicate a negative impact on household final consumption expenditure. The coefficient of money supply (lm2gdp), another proxy for financial deepening, and the coefficient of FDI have a positive effect on household final consumption expenditure in the long run. The study therefore recommends that, since gross fixed capital formation stimulates household consumption expenditure, a legal framework to support investment is a panacea for increasing household income and consumption and for reducing poverty in Nigeria, and should be a key central component of policy.
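
As a rough illustration of the estimation pipeline named above (ADF unit-root tests, a Johansen cointegration test, then a VECM), the Python sketch below uses statsmodels; the file name, column names, lag order, and cointegration rank are illustrative assumptions rather than the study's actual choices.

```python
# A minimal sketch of the testing pipeline described above (ADF stationarity
# tests, Johansen cointegration test, then a VECM), using statsmodels.
# The data file, column names, and lag choices are illustrative assumptions.

import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# hypothetical annual series, 1981-2019, one column per variable
df = pd.read_csv("nigeria_macro_1981_2019.csv", index_col="year")
variables = ["hh_consumption", "gov_consumption", "cpsgdp", "lm2gdp", "gfcf", "fdi"]
data = df[variables]

# Step 1: ADF unit-root test on each series in levels
for col in variables:
    stat, pvalue, *_ = adfuller(data[col].dropna(), autolag="AIC")
    print(f"ADF {col}: stat={stat:.3f}, p={pvalue:.3f}")

# Step 2: Johansen cointegration test (constant term, one lagged difference)
johansen = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", johansen.lr1)
print("5% critical values:", johansen.cvt[:, 1])

# Step 3: fit a VECM with the cointegration rank suggested by step 2
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci")
result = vecm.fit()
print(result.summary())
```

In practice the cointegration rank passed to VECM would be chosen by comparing the Johansen trace statistics against their critical values rather than being hard-coded as here.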

Keywords: household, government expenditures, vector error correction model, Johansen test

Procedia PDF Downloads 45