Search results for: Latent Dirichlet Allocation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1009

499 Future Considerations for Wounded Service Members and Veterans of the Global War on Terror

Authors: Selina Doncevic, Lisa Perla, Angela Kindvall

Abstract:

The Global War on Terror, which began after September 11, 2001, increased survivability of severe injuries requiring varying trajectories of rehabilitation and recovery. The costs encompass the physiologic, functional, social, emotional, psychological, vocational, and scholastic domains of life. The purpose of this poster is to inform private sector health care practitioners and clinicians at various levels of the unique and long-term dynamics of healthcare recovery for polytrauma and traumatic brain injured service members and veterans in the United States of America. Challenges include care delivery between the private sector, the Department of Defense, and Veterans Affairs healthcare systems while simultaneously supporting the dynamics of acute as well as latent complications associated with severe injury and illness. Clinical relevance, subtleties of protracted recovery, and overwhelmed systems of care are discussed in the context of lessons learned and in reflection on previous wars. Additional concerns for consideration and discussion include: the cost of protracted healthcare, various U.S. healthcare payer systems, lingering community reintegration challenges, ongoing caregiver support, the rise of veterans' support groups, and the development of private sector clinical partnerships.

Keywords: brain injury, future, polytrauma, rehabilitation

Procedia PDF Downloads 178
498 The Employees' Classification Method in the Space of Their Job Satisfaction, Loyalty and Involvement

Authors: Svetlana Ignatjeva, Jelena Slesareva

Abstract:

The aim of the study is the development and adaptation of a method to analyze and quantify the indicators characterizing the relationship between a company and its employees. Diagnostics of such indicators is one of the most complex and pressing issues in the psychology of labour. The proposed method is based on a questionnaire; its indicators reflect the cognitive, affective, and conative components of employees' socio-psychological attitude toward being as efficient as possible in their professional activities. This approach allows measuring not only the selected factors but also such parameters as cognitive and behavioural dissonances. Adaptation of the questionnaire includes factor structure analysis and suitability analysis of the measured phenomena indicators in terms of the internal consistency of individual factors. Structural validity of the questionnaire was tested by exploratory factor analysis (extraction method: principal component analysis; rotation method: varimax with Kaiser normalization). Factor analysis reduces the dimensionality of the phenomena, moving from individual indicators to aggregate indexes and latent variables. Aggregate indexes are obtained as the sum of the relevant indicators, followed by standardization. Cronbach's alpha was used to assess the reliability-consistency of the questionnaire items. Two-step cluster analysis in the space of the extracted factors classifies employees according to their attitude toward working in the company. The results of psychometric testing indicate the possibility of using the developed technique to analyze employees' attitudes towards their work and to develop recommendations for improvement.
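The abstract applies Cronbach's alpha without stating the formula: for k items, alpha = k/(k-1) · (1 − sum of item variances / variance of the respondent totals). A minimal Python sketch under invented Likert responses (the study's actual data are not published here):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    item_var = sum(pvariance(col) for col in items)    # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical responses: 3 items answered by 4 respondents
items = [[3, 4, 4, 5], [2, 4, 5, 5], [3, 3, 4, 5]]
alpha = cronbach_alpha(items)
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency.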

Keywords: involved in the organization, loyalty, organizations, method

Procedia PDF Downloads 335
497 Design of Collection and Transportation System of Municipal Solid Waste in Meshkinshahr City

Authors: Ebrahim Fataei, Seyed Ali Hosseini, Zahra Arabi, Habib farhadi, Mehdi Aalipour Erdi, Seiied Taghi Seiied Safavian

Abstract:

Solid waste production is an integral part of human life, and waste management requires a fully scientific approach and careful planning. Most of the management cost is allocated to collection and transportation, and operational efficiency in this system, achieved by limiting time consumption together with an optimal collection and transportation scheme, is the basis of waste management design. This study was carried out to optimize the existing collection and transportation system for municipal solid waste in Meshkinshahr city. Based on analyzed data on municipal solid waste components in seven zones of Meshkinshahr city, GIS software was applied to design storage locations based on at-source recycling and routes for collection and transport. The aim was to present an appropriate model to store, collect, and transport municipal solid waste. The results show that GIS can be applied to locate waste containers and determine waste collection routes in an appropriate way.

Keywords: municipal solid waste management, transportation, optimizing, GIS, Iran

Procedia PDF Downloads 508
496 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud

Authors: Shuen-Tai Wang, Yu-Ching Lin

Abstract:

With more and more workflow systems adopting the cloud as their execution environment, various challenges arise that must be addressed for the cloud to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the available processors of a distributed computing environment for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology and chains multiple tools together. To allow multiple pipelines to execute in parallel on the cloud, we developed a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to prior knowledge in the repository. This effort has the potential to support process definition, workflow enactment, and monitoring of workflow processes. Users benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge.

Keywords: workflow systems, resource provisioning, workload scheduling, web-based, workflow engine

Procedia PDF Downloads 134
495 Comparison of the Positive and Indeterminate Rates of QuantiFERON-TB Gold In-Tube and T-SPOT.TB According to Age Group

Authors: Kina Kim

Abstract:

Background: Two types of interferon-gamma release assays (IGRAs) are in use for the detection of latent tuberculosis infection (LTBI): QuantiFERON-TB Gold In-Tube (QFT-GIT) and T-SPOT.TB. Some reports indicate that IGRA results are affected by the patient's age. This study aims to compare the results of both IGRA tests across age groups. Methods: We reviewed 54,882 samples referred to an independent reference laboratory (Seegene Medical Foundation, Seoul, Korea) for the diagnosis of LTBI from January 1, 2021, to December 31, 2021. This retrospective study enrolled 955 patients tested using QFT-GIT and 53,927 patients tested using T-SPOT.TB. The results of both IGRAs were divided into three age groups (0-9, 10-17, and ≥18 years old). The positive and indeterminate rates of QFT-GIT and T-SPOT.TB were compared, and the differences in positive and indeterminate rates by age group were also evaluated. Results: The positive rate of QFT-GIT was 20.1% (192/955) and that of T-SPOT.TB was 8.7% (4704/53927) in the overall patient population. The positive rates of QFT-GIT in individuals aged 0-9, 10-17, and over 18 years were 15.4%, 13.3%, and 22.0%, respectively; the corresponding positive rates of T-SPOT.TB were 8.9%, 2.0%, and 8.8%. The overall prevalence of indeterminate results was 2.1% (20/955) for QFT-GIT and 0.5% (270/53927) for T-SPOT.TB. The indeterminate rates of QFT-GIT in individuals aged 0-9, 10-17, and over 18 years were 0.4%, 6.7%, and 2.6%, respectively; the corresponding indeterminate rates of T-SPOT.TB were 0.5%, 0.7%, and 0.5%. Conclusion: Our findings suggest that T-SPOT.TB has lower overall positive and indeterminate rates than QFT-GIT. The highest positive rate for QFT-GIT was found in the over-18 group, whereas the positive rates of T-SPOT.TB did not differ significantly among age groups. QFT-GIT showed variable and higher indeterminate rates across age groups, while T-SPOT.TB showed low rates (<1%) in all age groups.
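The reported rates follow directly from the counts given in the abstract; a quick arithmetic check:

```python
def rate(positives, total):
    """Percentage, rounded to one decimal as reported in the abstract."""
    return round(100 * positives / total, 1)

qft_positive = rate(192, 955)        # QFT-GIT positive rate
tspot_positive = rate(4704, 53927)   # T-SPOT.TB positive rate
qft_indet = rate(20, 955)            # QFT-GIT indeterminate rate
tspot_indet = rate(270, 53927)       # T-SPOT.TB indeterminate rate
```

These reproduce the 20.1%, 8.7%, 2.1%, and 0.5% figures stated in the results.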

Keywords: LTBI, IGRA, QFT-GIT, T-SPOT.TB

Procedia PDF Downloads 95
494 Optimal Allocation of PHEV Parking Lots to Minimize Distribution System Losses

Authors: Mohsen Mazidi, Ali Abbaspour, Mahmud Fotuhi-Firuzabad, Mohammad Rastegar

Abstract:

To tackle air pollution issues, Plug-in Hybrid Electric Vehicles (PHEVs) have been proposed as an appropriate solution. Charging a large number of PHEV batteries, if not controlled, would have negative impacts on the distribution system. The charging of these vehicles can be controlled centrally in parking lots, which offer a better chance for coordination than individual charging at home. In this paper, an optimization-based approach is proposed to determine the optimal PHEV parking capacities at candidate nodes of the distribution system. To this end, a profile for charging and discharging of PHEVs is developed in order to flatten the network load profile. This profile is then used in an optimization problem that minimizes distribution system losses. The outputs of the proposed method are the proper locations for PHEV parking lots and the optimal capacity of each lot. Application of the proposed method to the IEEE 34-node test feeder verifies its effectiveness.
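The abstract's exact charging/discharging profile is not reproduced; a common heuristic for flattening a load profile is greedy valley filling, in which each increment of charging energy is placed in the currently lowest-load hour. A sketch with an invented six-hour base load:

```python
def valley_fill(base_load, energy_units):
    """Greedy valley filling: add each unit of charging to the lowest-load hour."""
    load = list(base_load)
    for _ in range(energy_units):
        h = load.index(min(load))   # index of the current valley
        load[h] += 1                # charge one unit there
    return load

# Hypothetical 6-hour base load; schedule 5 units of PHEV charging
profile = valley_fill([5, 3, 2, 2, 4, 6], 5)
```

The total energy is preserved while the peaks are left untouched, which is the flattening effect the abstract describes.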

Keywords: loss, plug-in hybrid electric vehicle (PHEV), PHEV parking lot, V2G

Procedia PDF Downloads 519
493 Study of Parameters Influencing Dwell Times for Trains

Authors: Guillaume Craveur

Abstract:

The work presented here is a study of several parameters identified as influencing dwell times for trains. Three kinds of rolling stock are studied for this project, and the parameters considered are the number of passengers, the allocation of passengers, their priorities, the platform height, the door width, and the train design. For this study, numerous observations were recorded in several stations in Paris (France). Numerical simulations were then completed in order to study these parameters, the goal being to quantify the impact of each parameter on dwell times. For example, this study highlights the impact of platform height and of the presence of steps between the platform and the train. Three types of station platform are considered: the 'optimum' platform, 920 mm high; the standard platform, 550 mm high; and the high platform, 1150 mm high; different kinds of steps exist to fill the resulting gaps. In conclusion, this study shows the impact of these parameters on dwell times as a function of crowd size.

Keywords: dwell times, numerical tools, rolling stock, platforms

Procedia PDF Downloads 317
492 Artificial Intelligence for Cloud Computing

Authors: Sandesh Achar

Abstract:

Artificial intelligence (AI) is increasingly being incorporated into applications across sectors such as health, education, security, and agriculture. Recently, rapid development in cloud computing technology has led to AI being implemented in cloud computing to enhance and optimize the services rendered. The deployment of AI in cloud-based applications has brought about autonomous computing, whereby systems achieve stated results without human intervention. Despite the amount of research into autonomous computing, incorporating AI/ML into cloud computing to enhance its performance and resource allocation remains a fundamental challenge. This paper highlights different manifestations, roles, trends, and challenges related to AI-based cloud computing models, and reviews notable investigations and progress in the domain. Future directions are suggested for leveraging AI/ML in next-generation computing for emerging paradigms such as cloud environments. The major findings outlined in this paper concern the adoption of AI-based algorithms and techniques to increase operational efficiency, achieve cost savings and automation, reduce energy consumption, and solve complex cloud computing issues.

Keywords: artificial intelligence, cloud computing, deep learning, machine learning, internet of things

Procedia PDF Downloads 82
491 Teachers’ Stress as a Moderator of the Impact of POMPedaSens on Preschool Children’s Social-Emotional Learning

Authors: Maryam Zarra-Nezhad, Ali Moazami-Goodarzi, Joona Muotka, Nina Sajaniemi

Abstract:

This study examines the extent to which the impact of a universal intervention program, POMPedaSens, on children's early social-emotional learning (SEL) differs depending on early childhood education (ECE) teachers' stress at work. The POMPedaSens program aims to promote children's (5–6-year-olds) SEL by supporting ECE teachers' engagement and emotional availability. The intervention's effectiveness has been monitored using an 8-month randomized controlled trial design with an intervention group (IG; 26 teachers and 195 children) and a waiting control group (CG; 36 teachers and 198 children) that provided data before and after the program implementation. The ECE teachers in the IG were trained to implement the intervention program in their early childhood education and care groups. Latent change score analysis suggests that the program increases children's prosocial behavior in the IG when teachers show a low level of stress. No significant results were found for the IG regarding a change in antisocial behavior. However, when teachers showed a high level of stress, an increase in prosocial behavior and a decrease in antisocial behavior were found only for children in the CG. The results suggest a promising application of the POMPedaSens program for promoting prosocial behavior in early childhood when teachers have low stress. The intervention will likely need a longer time to display the moderating effect of ECE teachers' well-being on children's antisocial behavior change.

Keywords: early childhood, social-emotional learning, universal intervention program, professional development, teachers' stress

Procedia PDF Downloads 72
490 Reflections on the Resilience Construction of Megacities in the Context of Territorial Space Governance

Authors: Xin Jie Li

Abstract:

Due to population agglomeration, huge scale, and complex activities, megacities have become risk centers. To resist the risks brought by development uncertainty, the construction of resilient cities has become a common strategic choice for megacities. As a key link in promoting the modernization of the national governance system and governance capacity, optimizing the layout of national land space that focuses on ecology, production, and life and improving the rationality of spatial resource allocation are conducive to fundamentally promoting the resilience construction of megacities. Therefore, based on the perspective of territorial space governance, this article explores the potential risks faced by the territorial space of megacities and proposes possible paths for the resilience construction of megacities from four aspects: promoting the construction of a resilience system throughout the entire life cycle, constructing a disaster prevention and control system with ecological resilience, creating an industrial spatial pattern with production resilience, and enhancing community resilience to anchor the front line of risk response in megacities.

Keywords: mega cities, potential risks, resilient city construction, territorial and spatial governance

Procedia PDF Downloads 22
489 An Evolutionary Multi-Objective Optimization for Airport Gate Assignment Problem

Authors: Seyedmirsajad Mokhtarimousavi, Danial Talebi, Hamidreza Asgari

Abstract:

The Gate Assignment Problem (GAP) is one of the most substantial issues in airport operation. In principle, GAP aims to maintain the maximum capacity of the airport through the best possible allocation of resources (gates) in order to reach the optimal outcome. The problem involves a wide range of dependent and independent resources and their limitations, which add to the complexity of GAP from both theoretical and practical perspectives. In this study, GAP was mathematically formulated as a three-objective problem. The preliminary goal of the multi-objective formulation was to address a higher number of objectives that can be simultaneously optimized and therefore increase the practical efficiency of the final solution. The problem is solved by applying the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II). Results showed that the proposed mathematical model could address most of the major criteria in the decision-making process in airport management, in terms of minimizing both airport/airline costs and passenger walking time. Moreover, the proposed approach could reliably find acceptable feasible solutions.
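The abstract names NSGA-II but does not reproduce it; its core building block is non-dominated sorting over Pareto dominance. A minimal sketch of dominance and the first (rank-0) front for minimization objectives, using toy objective vectors rather than the paper's GAP formulation:

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): no worse everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    """Solutions not dominated by any other point (NSGA-II's rank-0 front)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Toy (cost, walking_time) objective vectors for candidate gate assignments
pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = first_front(pts)
```

NSGA-II repeatedly extracts such fronts, then uses crowding distance within each front to preserve solution diversity.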

Keywords: airport management, gate assignment problem, mathematical modeling, genetic algorithm, NSGA-II

Procedia PDF Downloads 275
488 Deep Feature Augmentation with Generative Adversarial Networks for Class Imbalance Learning in Medical Images

Authors: Rongbo Shen, Jianhua Yao, Kezhou Yan, Kuan Tian, Cheng Jiang, Ke Zhou

Abstract:

This study proposes a generative adversarial network (GAN) framework to perform synthetic sampling in feature space, i.e., feature augmentation, to address the class imbalance problem in medical image analysis. A feature extraction network is first trained to convert images into feature space. The GAN framework then incorporates adversarial learning to train a feature generator for the minority class by playing a minimax game with a discriminator. The feature generator generates features for the minority class from arbitrary latent distributions to balance the data between the majority and minority classes. Additionally, a data cleaning technique, the Tomek link, is employed to clean up undesirable conflicting features introduced by the feature augmentation and thus establish well-defined class clusters for training. The experiment section evaluates the proposed method on two medical image analysis tasks: mass classification on mammograms and cancer metastasis classification on histopathological images. Experimental results suggest that the proposed method obtains performance superior or comparable to the state-of-the-art counterparts, improving accuracy by more than 1.5 percentage points over all of them.
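A Tomek link is a pair of opposite-class points that are each other's nearest neighbors; removing such pairs cleans the class boundary. A small detection sketch with invented 2-D features (not the paper's GAN-generated features):

```python
import math

def tomek_links(X, y):
    """Return pairs (i, j) of mutual nearest neighbors with opposite labels."""
    def nn(i):
        # nearest neighbor of point i by Euclidean distance
        return min((j for j in range(len(X)) if j != i),
                   key=lambda j: math.dist(X[i], X[j]))
    links = set()
    for i in range(len(X)):
        j = nn(i)
        if y[i] != y[j] and nn(j) == i:       # mutual and opposite-class
            links.add(tuple(sorted((i, j))))
    return links

# Toy features: points 1 and 2 sit close together on the class boundary
X = [(0.0, 0.0), (1.0, 0.0), (1.2, 0.0), (3.0, 0.0)]
y = [0, 0, 1, 1]
links = tomek_links(X, y)
```

In a cleaning step one would typically drop the majority-class member (or both members) of each detected link before training.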

Keywords: class imbalance, synthetic sampling, feature augmentation, generative adversarial networks, data cleaning

Procedia PDF Downloads 107
487 Hierarchical Queue-Based Task Scheduling with CloudSim

Authors: Wanqing You, Kai Qian, Ying Qian

Abstract:

Cloud computing provides users with infrastructure, platform, and software as a service, making those services more accessible via the Internet. To better analyze the performance of cloud computing provisioning policies and resource allocation strategies, a toolkit named CloudSim was proposed. With CloudSim, a cloud computing environment can be easily constructed by modelling and simulating cloud computing components such as datacenters, hosts, and virtual machines. A good scheduling strategy is the key to achieving load balancing among different machines and to improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumptive cases on a single machine; however, they are unable to make the best decision for an unforeseen future. In real-world scenarios, there are numerous tasks and several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm that schedules tasks with CloudSim by taking into account several parameters: the machines' capacity, the priority of tasks, and the history log.
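CloudSim itself is a Java toolkit; as a language-neutral illustration of the multi-queue idea, this sketch dispatches tasks tier by tier (higher-priority queue first) to the currently least-loaded virtual machine. Task names, lengths, and VM count are invented:

```python
import heapq

def schedule(queues, vm_count):
    """Dispatch tasks queue by queue (highest priority first) to the least-loaded VM."""
    vms = [(0.0, i) for i in range(vm_count)]   # (accumulated load, vm id)
    heapq.heapify(vms)
    assignment = {}
    for tier in queues:                          # highest-priority queue first
        for task, length in tier:
            load, vm = heapq.heappop(vms)        # least-loaded VM
            assignment[task] = vm
            heapq.heappush(vms, (load + length, vm))
    return assignment

high = [("t1", 4.0), ("t2", 2.0)]   # hypothetical high-priority tasks
low = [("t3", 3.0), ("t4", 1.0)]    # hypothetical low-priority tasks
plan = schedule([high, low], vm_count=2)
```

Priority ordering is enforced by queue order, while the heap keeps per-VM load balanced, the two goals the abstract combines.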

Keywords: hierarchical queue, load balancing, CloudSim, information technology

Procedia PDF Downloads 401
486 Fiscal Size and Composition Effects on Growth: Empirical Evidence from Asian Economies

Authors: Jeeban Amgain

Abstract:

This paper investigates the impact of the size and composition of government expenditure and taxation on GDP per capita growth in 36 Asian economies over the period 1991-2012. The research employs panel regression techniques, Fixed Effects and the Generalized Method of Moments (GMM), as well as other statistical and descriptive approaches. The findings conclude that the size of government expenditure and tax revenue is generally low in this region. GDP per capita growth responds strongly negatively to government expenditure; however, no significant relationship can be measured for the size of taxation, although it is positively correlated with economic growth. Panel regression on decomposed fiscal components also shows that the pattern of allocation of expenditure and taxation really matters for growth. Taxes on international trade and property have a significant positive impact on growth. In contrast, a major portion of expenditure, i.e., expenditure on general public services, health, and education, is found to have a significant negative impact on growth, implying that government expenditures are not productive in the Asian region for some reason. A comparatively smaller and more efficient government would enhance growth.

Keywords: government expenditure, tax, GDP per capita growth, composition

Procedia PDF Downloads 449
485 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control

Authors: Sangwon Han, Chengquan Jin

Abstract:

Time-cost trade-off (TCTO) analysis is one of the most significant parts of construction project management. Despite its significance, current TCTO analysis, based on the Critical Path Method (CPM), does not consider resource constraints and accordingly sometimes generates impractical and/or infeasible schedules in terms of resource availability. Resource constraints therefore need to be considered in TCTO analysis. In this research, a genetic algorithm (GA) based optimization model is created to find the optimal schedule. This model is used to compare four distinct scenarios (1) the initial CPM schedule, 2) TCTO without resource constraints, 3) resource allocation after TCTO, and 4) TCTO with resource constraints) in terms of duration, cost, and resource utilization. The comparison identifies that 'TCTO with resource constraints' generates the optimal schedule with respect to duration, cost, and resources, verifying the need to consider resource constraints in TCTO analysis. The proposed model is expected to produce more feasible and optimal schedules.
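The GA model itself is not reproduced in the abstract; for a toy serial network of three activities, the basic time-cost trade-off (here without resource constraints) can simply be enumerated, which makes the crash-or-not decision concrete. All durations and costs below are hypothetical:

```python
from itertools import product

# Hypothetical serial activities: (normal_days, normal_cost, crash_days, crash_cost)
activities = [(5, 100, 3, 180), (4, 80, 2, 150), (6, 120, 5, 150)]

def best_plan(deadline):
    """Enumerate normal/crash choices; return the cheapest plan meeting the deadline."""
    best = None
    for choice in product([0, 1], repeat=len(activities)):   # 1 = crash activity
        days = sum(a[2] if c else a[0] for a, c in zip(activities, choice))
        cost = sum(a[3] if c else a[1] for a, c in zip(activities, choice))
        if days <= deadline and (best is None or cost < best[0]):
            best = (cost, days, choice)
    return best

cost, days, choice = best_plan(deadline=13)
```

Real projects need GA or MIP search because the choice space grows exponentially and resource limits couple the activities; the enumeration above only illustrates the trade-off itself.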

Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability

Procedia PDF Downloads 156
484 Integrated Vegetable Production Planning Considering Crop Rotation Rules Using a Mathematical Mixed Integer Programming Model

Authors: Mohammadali Abedini Sanigy, Jiangang Fei

Abstract:

In this paper, a mathematical optimization model was developed to maximize profit in a vegetable production planning problem. It serves as a decision support system that assists farmers with land allocation to crops and harvest scheduling decisions. The developed model can handle different rotation rules across two consecutive production cycles, a common practice in organic production systems. Moreover, different production methods for the same crop were considered in the model formulation. The main strength of the model is that it is not restricted to predetermined production periods, which makes the planning more flexible. The model is classified as a mixed integer programming (MIP) model, formulated in Pyomo (a Python package for formulating optimization models) and solved via the Gurobi and CPLEX optimizer packages. The model was tested with secondary data from Australian vegetable growing farms, and the results were obtained and discussed through computational test runs. The results show that the model can successfully provide reliable solutions for real-size problems.
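The paper's Pyomo model is not reproduced here; as an illustration of what a crop-rotation rule constrains (no crop repeated on the same plot across two consecutive cycles), a brute-force sketch over invented profit figures:

```python
from itertools import product

crops = ["lettuce", "carrot", "broccoli"]
# Hypothetical profit per plot for each crop in cycles 1 and 2
profit = {1: {"lettuce": 4, "carrot": 3, "broccoli": 5},
          2: {"lettuce": 2, "carrot": 6, "broccoli": 4}}

def best_rotation(n_plots=2):
    """Brute force: pick a crop per plot per cycle, forbidding repeats on a plot."""
    best = (float("-inf"), None)
    for plan in product(product(crops, repeat=2), repeat=n_plots):
        if any(c1 == c2 for c1, c2 in plan):            # rotation rule violated
            continue
        total = sum(profit[1][c1] + profit[2][c2] for c1, c2 in plan)
        best = max(best, (total, plan))
    return best

total, plan = best_rotation()
```

A MIP formulation replaces this enumeration with binary assignment variables and a per-plot constraint forbidding the same crop in consecutive cycles, which is what keeps realistic instance sizes tractable.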

Keywords: crop rotation, harvesting, mathematical model formulation, vegetable production

Procedia PDF Downloads 163
483 The Making of a Community: Perception versus Reality of Neighborhood Resources

Authors: Kirstie Smith

Abstract:

This paper elucidates the value of neighborhood perception as it contributes to the advancement of well-being for individuals and families within a neighborhood. Through in-depth interviews with city residents, this paper examines the degree to which key stakeholders (residents) evaluate their neighborhood and its perceived resources, and identify, access, and utilize local assets existing in the community. Additionally, the research included conducting a community inventory that catalogued the community assets and resources of lower-income neighborhoods of a medium-sized industrial city. Analysis of the community's assets was compared with the interview results to allow for a better understanding of the community's condition. Community mapping revealed that the key informants' reflections on assets were only somewhat validated: in each neighborhood, more assets were mapped than were reported in the interviews. Another chief conclusion drawn from this study was the identification of key development partners and social networks that offer the potential to facilitate locally driven community development. Overall, the participants provided invaluable local knowledge of perceived neighborhood assets, the well-being of residents, the condition of the community, and suggestions for responding to the challenges of the entire community in order to mobilize its assets and networks.

Keywords: community mapping, family, resource allocation, social networks

Procedia PDF Downloads 317
482 Developing a Machine Learning-Based Cost Prediction Model for Construction Projects Using Particle Swarm Optimization

Authors: Soheila Sadeghi

Abstract:

Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. 
Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
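The evaluation metrics the study cites are standard; a compact sketch computing MSE, RMSE, MAE, and R-squared for invented actual/predicted cost pairs:

```python
import math

def metrics(y_true, y_pred):
    """MSE, RMSE, MAE, and R-squared for paired actual/predicted costs."""
    n = len(y_true)
    errs = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errs) / n
    mae = sum(abs(e) for e in errs) / n
    mean = sum(y_true) / n
    ss_tot = sum((t - mean) ** 2 for t in y_true)        # total variance of actuals
    r2 = 1 - sum(e * e for e in errs) / ss_tot           # fraction explained
    return {"MSE": mse, "RMSE": math.sqrt(mse), "MAE": mae, "R2": r2}

# Hypothetical actual vs. predicted project costs (same units)
m = metrics([100, 150, 200, 250], [110, 140, 205, 245])
```

MSE and RMSE penalize large errors more heavily than MAE, while R-squared reports the fraction of cost variance the model explains, which is why the study reports all four.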

Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction

Procedia PDF Downloads 23
481 A Gauge Repeatability and Reproducibility Study for Multivariate Measurement Systems

Authors: Jeh-Nan Pan, Chung-I Li

Abstract:

Measurement system analysis (MSA) plays an important role in helping organizations improve their product quality. Generally speaking, the gauge repeatability and reproducibility (GRR) study is performed according to the MSA handbook stated in the QS9000 standards. Usually, a GRR study for assessing the adequacy of gauge variation needs to be conducted prior to process capability analysis. Traditional MSA considers only a single quality characteristic. With the advent of modern technology, industrial products have become very sophisticated, with more than one quality characteristic. Thus, it becomes necessary to perform a multivariate GRR analysis for a measurement system when collecting data with multiple responses. In this paper, we take the correlation coefficients among tolerances into account to revise the multivariate precision-to-tolerance (P/T) ratio proposed by Majeske (2008). We then compare the performance of our revised P/T ratio with that of the existing ratios. The simulation results show that our revised P/T ratio outperforms the others in terms of robustness and proximity to the actual value. Moreover, the optimal allocation of several parameters, such as the number of quality characteristics (v), sample size of parts (p), number of operators (o) and replicate measurements (r), is discussed using the confidence interval of the revised P/T ratio. Finally, a standard operating procedure (S.O.P.) for performing the GRR study for multivariate measurement systems is proposed based on the research results. Hopefully, it can serve as a useful reference for quality practitioners when conducting such studies in industry.
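The paper's revised multivariate P/T ratio is not reproduced here, but the univariate baseline it generalizes can be sketched as follows. The variance components, specification limits and the 0.10 acceptance threshold below are illustrative assumptions, not values from the study:

```python
# Illustrative (univariate) precision-to-tolerance ratio from a GRR study.
# The multivariate revision discussed in the paper additionally accounts
# for correlation among tolerances, which this sketch does not.

def pt_ratio(sigma_repeatability, sigma_reproducibility, usl, lsl, k=6.0):
    """P/T = k * sigma_GRR / (USL - LSL); k = 6 covers ~99.73% of gauge variation."""
    sigma_grr = (sigma_repeatability**2 + sigma_reproducibility**2) ** 0.5
    return k * sigma_grr / (usl - lsl)

ratio = pt_ratio(0.02, 0.015, usl=10.5, lsl=9.5)  # = 0.15
# A common rule of thumb deems the gauge acceptable when P/T <= 0.10,
# so this hypothetical gauge consumes too much of the tolerance band.
acceptable = ratio <= 0.10
```

The smaller the ratio, the less of the tolerance band the measurement system itself consumes.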

Keywords: gauge repeatability and reproducibility, multivariate measurement system analysis, precision-to-tolerance ratio

Procedia PDF Downloads 235
480 The Impact of Corporate Social Responsibility Information Disclosure on the Accuracy of Analysts' Earnings Forecasts

Authors: Xin-Hua Zhao

Abstract:

In recent years, the number of social responsibility reports disclosed by Chinese corporations has grown rapidly, and the economic effects of this growing disclosure have become a hot topic. The article takes the listed chemical engineering corporations that disclose social responsibility reports in China as a sample and, based on information asymmetry theory, examines the economic effect generated by corporate social responsibility disclosure using the method of ordinary least squares. The research is conducted from the perspective of analysts’ earnings forecasts and studies the impact of corporate social responsibility information disclosure on improving the accuracy of analysts’ earnings forecasts. The results show a statistically significant negative correlation between the corporate social responsibility disclosure index and analysts’ earnings forecast error. The conclusions confirm that enterprises can reduce the asymmetry of social and environmental information by disclosing social responsibility reports and thus improve the accuracy of analysts’ earnings forecasts, which promotes the effective allocation of resources in the market.
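The regression design described above can be sketched in a few lines. The data points and variable names here are made up for illustration; the study's actual controls and sample are not reproduced:

```python
import numpy as np

# Hedged sketch of the OLS setup: regress analyst forecast error on a
# CSR disclosure index (all numbers below are fabricated for illustration).
csr_index = np.array([0.2, 0.4, 0.5, 0.7, 0.9])       # disclosure scores
forecast_error = np.array([0.08, 0.07, 0.05, 0.04, 0.02])

X = np.column_stack([np.ones_like(csr_index), csr_index])  # intercept + slope
beta, *_ = np.linalg.lstsq(X, forecast_error, rcond=None)
intercept, slope = beta
# A negative slope mirrors the paper's reported relationship:
# more disclosure, smaller forecast error.
```

In the actual study the specification would also include the control variables typical of forecast-accuracy regressions.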

Keywords: analysts' earnings forecasts, corporate social responsibility disclosure, economic effect, information asymmetry

Procedia PDF Downloads 135
479 Norms and Laws: Fate of Community Forestry in Jharkhand

Authors: Pawas Suren

Abstract:

The conflict between livelihood and forest protection has been a perpetual phenomenon in India. In the era of climate change, the problem is expected to aggravate the declining trend of dense forest in the country, creating impediments to climate change adaptation by forest-dependent communities. To assess the complexity of the problem, the Hazaribagh and Chatra districts of Jharkhand were selected as a case study. To identify the norms practiced by the communities to manage community forestry, an ethnographic study was designed to understand the values, traditions, and cultures of the forest-dependent communities, most of whom are tribal. It was observed that the internalization of efficient forest norms is reflected in the pride and honor attached to such behavior, while violators are sanctioned through guilt and shame. The study analyzes the effect of the norms being practiced on the management and ecology of community forestry as a common property resource. The findings point toward gaps in the prevalent forest laws in addressing the efficient allocation of property rights. The conclusion calls for reconsidering the accepted factors of forest degradation in India.

Keywords: climate change, common property resource, community forestry, norms

Procedia PDF Downloads 325
478 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes cycle-accurate simulation results for the weight values learned by an auto-encoder behavior model in terms of pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, in the process of handling such huge amounts of data, the computational complexity and run time of these learning methods have limited advanced research. These limitations stem from the fact that the algorithms were computed using only single-core CPUs. For this reason, parallel hardware such as FPGAs was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With pre-route simulation of the auto-encoder behavior model, we obtained cycle-accurate results for the parameters of each hidden layer using ModelSim. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper demonstrates proper operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
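The computation the behavior model mimics can be sketched in numpy. This is a minimal linear encoder/decoder with one plain gradient-descent step; the paper's activation functions, fixed-point details and Kyoto-image data are omitted, and all sizes below are invented for illustration:

```python
import numpy as np

# Minimal auto-encoder sketch: linear encode, decode, one gradient step.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 1))                 # one 8-pixel input patch
W1 = 0.1 * rng.normal(size=(4, 8))          # encoder: 8 inputs -> 4 hidden
W2 = 0.1 * rng.normal(size=(8, 4))          # decoder: 4 hidden -> 8 outputs

def loss(W1, W2, x):
    """Squared reconstruction error of the patch."""
    return 0.5 * float(np.sum((W2 @ (W1 @ x) - x) ** 2))

loss_before = loss(W1, W2, x)
h = W1 @ x                                  # hidden representation
err = W2 @ h - x                            # reconstruction error
g2, g1 = err @ h.T, W2.T @ err @ x.T        # gradients of the squared error
lr = 0.05
W2, W1 = W2 - lr * g2, W1 - lr * g1         # one gradient-descent step
loss_after = loss(W1, W2, x)
# One step should shrink the reconstruction error slightly.
```

In the paper's flow, each such weight update would be checked cycle by cycle against the Verilog behavior model before committing to hardware.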

Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, unsupervised feature learning

Procedia PDF Downloads 423
477 Advocacy for Increasing Health Care Budget in Parepare City with DALY Approach: Case Study on Improving Public Health Insurance Budget

Authors: Kasman, Darmawansyah, Alimin Maidin, Amran Razak

Abstract:

Background: Under decentralization, advocacy is needed to increase the health budget in Parepare District. One of the advocacy methods recommended by the World Bank is the economic-loss approach. Methods: This research is an observational study in health economics that directly quantifies the magnitude of the economic loss to the community and the government and provides advocacy to the executive and legislative branches to show the harm it causes. Results: The results show a direct cost, consisting of household expenditure on transport, of Rp295.865.500. The indirect cost comprises YLD of Rp14.688.000.000 and YLL of Rp28.986.336.000, so the DALY amount is Rp43.674.336.000. The total economic loss is Rp43.970.201.500. These huge economic losses can be prevented by increasing the allocation of health budgets for promotive and preventive efforts and expanding the coverage of health insurance for the community. Conclusion: There is a need to advocate to the executive and legislative branches about the importance of guaranteeing public health financing by conducting studies of the economic losses, so that all strategic alliances believe that health is an investment.
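The DALY arithmetic underlying figures like those above can be sketched as follows. The case counts, disability weight and durations here are illustrative placeholders, not the study's Parepare data:

```python
# Hedged sketch of the standard DALY decomposition: DALY = YLL + YLD.
# All inputs below are invented for illustration.

def yll(deaths, life_expectancy_remaining):
    """Years of Life Lost = deaths x standard remaining life expectancy."""
    return deaths * life_expectancy_remaining

def yld(cases, disability_weight, duration_years):
    """Years Lived with Disability = cases x disability weight x duration."""
    return cases * disability_weight * duration_years

daly = (yll(deaths=120, life_expectancy_remaining=30)
        + yld(cases=5000, disability_weight=0.2, duration_years=0.5))
# Monetising each DALY (e.g. by GDP per capita) then yields currency
# figures of the kind the study reports as indirect cost.
```

Adding the monetised DALYs to directly measured household costs gives the total economic loss used in the advocacy.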

Keywords: advocacy, DALY, economic losses, health insurance

Procedia PDF Downloads 91
476 Heat Transfer Enhancement of Structural Concretes Made of Macro-Encapsulated Phase Change Materials

Authors: Ehsan Mohseni, Waiching Tang, Shanyong Wang

Abstract:

Low thermal conductivity of phase change materials (PCMs) affects the thermal performance and energy storage efficiency of latent heat thermal energy storage systems. In the current research, a structural lightweight concrete with the function of indoor temperature control was developed using thermal energy storage aggregates (TESA) and nano-titanium (NT). The macro-encapsulation technique was used to incorporate the PCM into the lightweight aggregate through vacuum impregnation. The compressive strength was measured, and the thermal performance of the concrete panel was evaluated using a self-designed environmental chamber. The impact of NT on the microstructure was also assessed via scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS) tests. The test results indicated that NT was able to increase the compressive strength by filling the micro pores and making the microstructure denser and more homogeneous. In addition, the environmental chamber experiment showed that the introduction of NT into TESA improved the heat transfer of the composites noticeably. The changes were illustrated by the reduction in peak temperatures at the centre, outside and inside surfaces of the concrete panels with the inclusion of NT. It can be concluded that NT particles have the capability to decrease energy consumption and achieve higher energy storage efficiency through the reduction of indoor temperature.
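A back-of-envelope calculation shows why PCM latent heat dominates the storage budget, and hence why improving conductivity (here, via nano-titanium) matters. The property values below are typical textbook figures for a paraffin-like PCM, not measurements from this study:

```python
# Sensible + latent energy absorbed by 1 kg of a hypothetical PCM as it
# heats through its melting point (illustrative property values).

def stored_energy_j(mass_kg, c_solid, dT_solid, latent_heat, c_liquid, dT_liquid):
    """Total heat absorbed across a melt: sensible (solid) + latent + sensible (liquid)."""
    return mass_kg * (c_solid * dT_solid + latent_heat + c_liquid * dT_liquid)

q = stored_energy_j(mass_kg=1.0, c_solid=2000.0, dT_solid=5.0,
                    latent_heat=180_000.0, c_liquid=2100.0, dT_liquid=5.0)
# The ~180 kJ latent term dwarfs the ~20 kJ of sensible heat, but it is
# only usable if heat can actually reach the PCM - the conductivity
# bottleneck that encapsulation plus NT aims to relieve.
```

The same arithmetic explains why peak-temperature reduction in the panel tests tracks how fast heat can be conducted into the encapsulated PCM.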

Keywords: heat transfer, macro-encapsulation, microstructure properties, nanoparticles, phase change material

Procedia PDF Downloads 88
475 Numerical Simulation of Transient 3D Temperature and Kerf Formation in Laser Fusion Cutting

Authors: Karim Kheloufi, El Hachemi Amara

Abstract:

In the present study, a three-dimensional transient numerical model was developed to study the temperature field and cutting kerf shape during laser fusion cutting. The finite volume model has been constructed based on the Navier–Stokes equations and the energy conservation equation for the description of momentum and heat transport phenomena, and the Volume of Fluid (VOF) method for free surface tracking. The Fresnel absorption model is used to handle the absorption of the incident wave by the surface of the liquid metal, and the enthalpy-porosity technique is employed to account for the latent heat during melting and solidification of the material. To model the physical phenomena occurring at the liquid film/gas interface, including momentum and heat transfer, a new approach is proposed which consists of treating the friction force, the pressure force applied by the gas jet and the heat absorbed by the cutting front surface as source terms incorporated into the governing equations. All of these physical phenomena are coupled and solved simultaneously in Fluent CFD®. The main objective of using a transient phase change model in the current case is to simulate the dynamics and geometry of a growing laser-cutting-generated kerf until it becomes fully developed. The model is used to investigate the effect of some process parameters on the temperature fields and the formed kerf geometry.
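The enthalpy-porosity bookkeeping mentioned above can be sketched in a few lines: the latent heat is smeared over a mushy zone via a liquid fraction that ramps between solidus and liquidus. The temperatures and material constants below are illustrative, not the study's values:

```python
# Sketch of enthalpy-porosity latent-heat accounting over a melting range.

def liquid_fraction(T, T_solidus=1800.0, T_liquidus=1820.0):
    """Linear ramp of liquid fraction across the mushy zone."""
    if T <= T_solidus:
        return 0.0
    if T >= T_liquidus:
        return 1.0
    return (T - T_solidus) / (T_liquidus - T_solidus)

def apparent_enthalpy(T, cp=700.0, latent=2.7e5):
    """Sensible enthalpy plus the latent contribution weighted by liquid fraction."""
    return cp * T + liquid_fraction(T) * latent

# Inside the mushy zone the effective heat capacity is inflated by the
# latent term, which is how the solver absorbs melting/solidification
# without tracking a sharp phase front explicitly.
```

In the full model this enthalpy enters the energy equation, and the liquid fraction also damps velocities in partially solid cells.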

Keywords: laser cutting, numerical simulation, heat transfer, fluid flow

Procedia PDF Downloads 311
474 Use of Improved Genetic Algorithm in Cloud Computing to Reduce Energy Consumption in Migration of Virtual Machines

Authors: Marziyeh Bahrami, Hamed Pahlevan Hsseini, Behnam Ghamami, Arman Alvanpour, Hamed Ezzati, Amir Salar Sadeghi

Abstract:

One of the ways to increase the efficiency of services in multi-agent systems and, of course, in the world of cloud computing is to use virtualization techniques. The aim of this research is to introduce changes into cloud computing services that reduce, as much as possible, the energy consumption related to the migration of virtual machines and, by extension, the energy related to the allocation of resources, thereby reducing the amount of pollution. Several methods have so far been proposed to increase the efficiency of cloud computing services in order to save energy in the cloud environment. The method presented in this article tries to prevent, as much as possible, energy consumption by data centers and the consequent production of carbon and biological pollutants by increasing the efficiency of cloud computing services. The results show that the proposed algorithm, using improved virtualization techniques with the help of a genetic algorithm, improves the efficiency of cloud services in migrating virtual machines and ultimately saves energy.
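A toy version of the genetic-algorithm idea can be sketched as follows. This is a stand-in for the paper's improved GA: here a chromosome assigns each VM to a host, fitness is simply the number of powered-on hosts (a crude proxy for energy), and all sizes and capacities are invented:

```python
import random

# Toy GA for VM consolidation: fewer active hosts ~ less energy.
random.seed(1)
N_VMS, N_HOSTS, POP, GENS = 12, 6, 20, 60
HOST_CAP = 3   # each host fits at most 3 unit-load VMs

def feasible(chrom):
    """No host exceeds its capacity."""
    return all(chrom.count(h) <= HOST_CAP for h in range(N_HOSTS))

def energy(chrom):
    """Fitness proxy: number of powered-on hosts."""
    return len(set(chrom))

def mutate(chrom):
    """Reassign one random VM; keep the parent if the child is infeasible."""
    child = chrom[:]
    child[random.randrange(N_VMS)] = random.randrange(N_HOSTS)
    return child if feasible(child) else chrom

pop = []
while len(pop) < POP:                      # feasible random initial population
    c = [random.randrange(N_HOSTS) for _ in range(N_VMS)]
    if feasible(c):
        pop.append(c)

for _ in range(GENS):                      # elitist selection + mutation
    pop.sort(key=energy)
    elite = pop[:POP // 2]
    pop = elite + [mutate(random.choice(elite)) for _ in range(POP - POP // 2)]

best = min(pop, key=energy)
# 12 unit-load VMs need at least ceil(12/3) = 4 hosts; the GA should
# drive the count toward that bound.
```

The real algorithm would fold migration cost and per-host power models into the fitness function rather than just counting active hosts.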

Keywords: consumption reduction, cloud computing, genetic algorithm, live migration, virtual machine

Procedia PDF Downloads 34
473 Dimensional Investigation of Food Addiction in Individuals Who Have Undergone Bariatric Surgery

Authors: Ligia Florio, João Mauricio Castaldelli-Maia

Abstract:

Background: Food addiction (FA) emerged in the 1990s as a possible contributor to the increasing prevalence of obesity and overweight, in conjunction with changing food environments and mental health conditions. However, FA is not yet listed as a disorder in the DSM-5 or the ICD-11. Although there are controversies and debates in the literature about the classification and construct of FA, the most common approach to assessing it is a research tool, the Yale Food Addiction Scale (YFAS), which approximates the concept of FA to the diagnosis of dependence on psychoactive substances. There is a need to explore the dimensional phenotypes accessed by the YFAS in different population groups for a better understanding of, and scientific support for, FA diagnoses. Methods: The primary objective of this project was to investigate the construct validity of the FA concept as measured by the mYFAS 2.0 in individuals who underwent bariatric surgery (n = 100) at the Hospital Estadual Mário Covas since 2011. Statistical analyses were conducted using STATA software. Structural or factor validity was the type of construct validity investigated, using exploratory factor analysis (EFA) and item response theory (IRT) techniques. Results: The EFA showed that the one-dimensional model was the most parsimonious. The IRT showed that all criteria contributed to the latent structure, presenting discrimination values greater than 0.5, with most presenting values greater than 2. Conclusion: This study reinforces a single FA dimension in patients who underwent bariatric surgery. Within this dimension, we identified the most severe and discriminating criteria for the diagnosis of FA.
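The discrimination values reported above come from item response theory curves of the following kind. This sketches a standard two-parameter logistic (2PL) item characteristic curve; the parameter values are illustrative, not the study's estimates:

```python
import math

# 2PL IRT curve: probability of endorsing a criterion at latent severity
# theta, with discrimination a and difficulty (severity location) b.

def p_endorse(theta, a, b):
    """P(endorse | theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A criterion with a = 2 separates respondents sharply around b = 1:
low = p_endorse(0.0, a=2.0, b=1.0)    # well below the item's location
high = p_endorse(2.0, a=2.0, b=1.0)   # well above it
# Discrimination above the paper's 0.5 floor means the curve rises fast
# enough to distinguish adjacent severity levels on the FA dimension.
```

Items with high b are the "most severe" criteria; items with high a are the most discriminating ones.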

Keywords: obesity, food addiction, bariatric surgery, regain

Procedia PDF Downloads 54
472 A Two Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is an outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach that combines the join-idle-queue and join-shortest-queue approaches. The authors have used the CloudAnalyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
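The two-level combination can be sketched as a small dispatch simulation: level one consults a join-idle-queue registry, and level two falls back to join-shortest-queue when no server is idle. The server counts and arrival pattern below are our own illustration, not the paper's CloudAnalyst setup:

```python
import random

# Two-level dispatcher sketch: JIQ first, JSQ as fallback.
random.seed(0)
queues = [0] * 8            # outstanding tasks per server
idle = set(range(8))        # servers currently idle (JIQ registry)

def dispatch(queues, idle):
    """Level 1: send to any known idle server. Level 2: join shortest queue."""
    if idle:
        server = idle.pop()  # set.pop() picks an arbitrary idle server
    else:
        server = min(range(len(queues)), key=queues.__getitem__)
    queues[server] += 1
    return server

for _ in range(20):          # 20 arrivals, no departures, for simplicity
    dispatch(queues, idle)
# The first 8 tasks land on the 8 idle servers; JSQ then keeps the
# remaining 12 spread so queue lengths differ by at most one.
```

JIQ keeps the common case cheap (no queue scan) while JSQ guarantees balance once every server is busy.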

Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling

Procedia PDF Downloads 405
471 Detection of Atrial Fibrillation Using Wearables via Attentional Two-Stream Heterogeneous Networks

Authors: Huawei Bai, Jianguo Yao

Abstract:

Atrial fibrillation (AF) is the most common form of heart arrhythmia and is closely associated with mortality and morbidity in heart failure, stroke, and coronary artery disease. The development of single-spot optical sensors enables widespread photoplethysmography (PPG) screening, especially for AF, since it represents a more convenient and noninvasive approach. To our knowledge, most existing studies, based on public and unbalanced datasets, can barely handle the multiple noise sources encountered in the real world and also lack interpretability. In this paper, we construct a large-scale PPG dataset using measurements collected from PPG wristwatch devices worn by volunteers and propose an attention-based two-stream heterogeneous neural network (TSHNN). The first stream is a hybrid neural network consisting of a three-layer one-dimensional convolutional neural network (1D-CNN) and a two-layer attention-based bidirectional long short-term memory (Bi-LSTM) network that learns representations from temporally sampled signals. The second stream extracts latent representations from the PPG time-frequency spectrogram using a five-layer CNN. The outputs from both streams are fed into a fusion layer to produce the outcome. Visualization of the learned attention weights demonstrates the effectiveness of the attention mechanism against noise. The experimental results show that the TSHNN outperforms all the competitive baseline approaches and, with 98.09% accuracy, achieves state-of-the-art performance.
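The fusion step can be sketched independently of the two feature extractors: each stream produces a feature vector, and softmax-normalised attention weights decide how much each stream contributes. The dimensions and the scalar-score formulation below are our simplification; the paper's CNN and Bi-LSTM extractors are not reproduced:

```python
import numpy as np

# Attention-weighted fusion of two stream outputs (illustrative shapes).
rng = np.random.default_rng(42)
temporal_feat = rng.normal(size=16)   # from the 1D-CNN + Bi-LSTM stream
spectral_feat = rng.normal(size=16)   # from the spectrogram-CNN stream

def attention_fuse(streams, w):
    """Score each stream with a learned vector w, softmax, then mix."""
    scores = np.array([v @ w for v in streams])
    alpha = np.exp(scores - scores.max())   # numerically stable softmax
    alpha /= alpha.sum()
    fused = sum(a * v for a, v in zip(alpha, streams))
    return alpha, fused

w = rng.normal(size=16)                # stand-in for a learned projection
alpha, fused = attention_fuse([temporal_feat, spectral_feat], w)
# alpha exposes how much each stream contributed - e.g. the spectral
# stream can dominate when the time-domain signal is noisy, which is
# the interpretability the attention visualization provides.
```

The fused vector would then feed the final classification layer.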

Keywords: PPG wearables, atrial fibrillation, feature fusion, attention mechanism, hybrid network

Procedia PDF Downloads 93
470 Highway Capacity and Level of Service

Authors: Kidist Mesfin Nguse

Abstract:

Ethiopia is the second most populous nation in Africa, with about 121 million people according to the 2022 population report. In recent years, the Ethiopian government (GOE) has been gradually growing its road network. With 138,127 kilometers (85,825 miles) of all-weather roads as of the end of 2018-19, Ethiopia possessed just 39% of the necessary road network and lacked a well-organized system. The Ethiopian urban population report records that about 21% of the population lives in urban areas, and this high population, coupled with growth in various infrastructures, has led to the migration of the workforce from rural areas to cities across the country. On main roads, the heterogeneous traffic flow with various operational features makes conditions more unfavorable, causing frequent congestion along stretches of road. The Level of Service (LOS), a qualitative measure of traffic, is categorized based on the operating conditions in the traffic stream. Determining the capacity and LOS is crucial, as this affects the planning, design and operation of traffic systems, and the allocation of route selection for infrastructure building projects that provide a considerably good level of service.
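LOS categorization is commonly expressed as a lookup on the volume-to-capacity (v/c) ratio. The breakpoints below are indicative HCM-style values for illustration only; a study like this one would calibrate its own thresholds for heterogeneous Ethiopian traffic:

```python
# Illustrative level-of-service lookup from a volume-to-capacity ratio.

def level_of_service(volume_vph, capacity_vph):
    """Map v/c to LOS A-F using indicative (not calibrated) breakpoints."""
    vc = volume_vph / capacity_vph
    for los, limit in [("A", 0.35), ("B", 0.55), ("C", 0.75),
                       ("D", 0.90), ("E", 1.00)]:
        if vc <= limit:
            return los
    return "F"   # over capacity: forced or breakdown flow

los = level_of_service(volume_vph=1500, capacity_vph=2000)  # v/c = 0.75
```

LOS A corresponds to free flow and LOS F to breakdown, which is why capacity estimates drive both design and route-selection decisions.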

Keywords: capacity, level of service, traffic volume, free flow speed

Procedia PDF Downloads 27