Search results for: Linear Discriminant Analysis (LDA)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29961

29151 Budgetary Performance Model for Managing Pavement Maintenance

Authors: Vivek Hokam, Vishrut Landge

Abstract:

An ideal maintenance program for an industrial road network is one that would maintain all sections at a sufficiently high level of functional and structural condition. However, due to various constraints such as budget, manpower and equipment, it is not possible to carry out maintenance on all the needy industrial road sections within a given planning period. A rational and systematic priority scheme needs to be employed to select and schedule industrial road sections for maintenance. Priority analysis is a multi-criteria process that determines the best ranking list of sections for maintenance based on several factors. In priority setting, difficult decisions must be made about which sections to maintain: is it more important to repair a section in poor functional condition (for example, one giving an uncomfortable ride) or one in poor structural condition, i.e., a section in danger of becoming structurally unsound? It would seem, therefore, that any rational priority-setting approach must consider the relative importance of the functional and structural condition of the section. Maintenance priority indices and pavement performance models tend to focus mainly on pavement condition, traffic criteria, etc. There is a need to develop a model that can be applied under the limited budget provisions available for pavement maintenance. Linear programming is one of the most popular and widely used quantitative techniques. A linear programming model provides an efficient method for determining an optimal decision chosen from a large number of possible decisions. The optimum decision is one that meets a specified objective of management, subject to various constraints and restrictions. The objective here is mainly the minimization of the maintenance cost of roads in an industrial area. In order to determine the objective function for the analysis of the distress model, it is necessary to fit realistic data into the formulation. Each type of repair is quantified over a number of stretches, with 1000 m taken as one stretch. The road section considered in this study is 3750 m long. These quantities are put into an objective function that maximizes the number of repairs in a stretch in relation to the quantity. The distresses observed in this stretch are potholes, surface cracks, rutting and ravelling. The distress data are measured manually by observing each distress level over a stretch of 1000 m. The maintenance and rehabilitation measures currently followed are based on subjective judgment. Hence, there is a need to adopt a scientific approach in order to use the limited resources effectively. It is also necessary to determine pavement performance and deterioration prediction relationships more accurately and to quantify the economic benefits to the road network in terms of vehicle operating cost. The road network infrastructure should deliver the best results possible from the available funds. In this paper, the objective function for the distress model is determined by linear programming, and a deterioration model considering overloading is discussed.
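
A minimal sketch of how such a budget-constrained repair allocation could be set up as a linear program is given below; the distress quantities, unit costs and budget are assumed illustrative figures, not values from the study.

```python
# Illustrative sketch only: repair quantities, unit costs and the budget are
# assumed values, not data from the study.
import numpy as np
from scipy.optimize import linprog

distresses = ["potholes", "surface cracks", "rutting", "ravelling"]
unit_cost = np.array([120.0, 45.0, 200.0, 60.0])   # cost per repaired unit (assumed)
max_qty = np.array([30, 80, 15, 50])                # observed quantity per distress (assumed)
budget = 6000.0                                     # available maintenance budget (assumed)

# Maximize the number of repaired units subject to the budget,
# i.e. minimize the negative of the total repaired quantity.
c = -np.ones(len(distresses))
A_ub = [unit_cost]                   # total cost must not exceed the budget
b_ub = [budget]
bounds = [(0, q) for q in max_qty]   # cannot repair more than what is observed

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
for name, x in zip(distresses, res.x):
    print(f"{name}: repair {x:.1f} units")
print(f"total cost: {unit_cost @ res.x:.0f}")
```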

Keywords: budget, maintenance, deterioration, priority

Procedia PDF Downloads 208
29150 Estimating Housing Prices Using Automatic Linear Modeling in the Metropolis of Mashhad, Iran

Authors: Mohammad Rahim Rahnama

Abstract:

The market-transaction price of housing is the main criterion for determining municipal taxes and is determined and announced on an annual basis. Of course, there is a discrepancy between the value of transactions recorded by the Bureau of Finance (P for short) or the municipality (P´ for short) and the real price on the market (P˝). The present research aims to determine the real price of housing in the metropolis of Mashhad, to pinpoint the price gap with those of the aforementioned apparatuses, and to identify the factors affecting it. In order to reach this practical objective, Automatic Linear Modeling, which calls for an explanatory research design, was utilized. The population of the research consisted of all the residential units in Mashhad, from which 317 residential units were randomly selected. Through cluster sampling, out of the 170 income blocks defined by the municipality, three blocks from the high-income (Kosar), middle-income (Elahieh), and low-income (Seyyedi) strata were surveyed using questionnaires during February and March of 2015, and information regarding the price and specifications of the residential units was gathered. In order to estimate the effect of various factors on the price, the relationship between the independent variables (8 variables) and the dependent variable of housing price was calculated using Automatic Linear Modeling in SPSS. The results revealed that the average housing price is $788 per square meter, compared to the Bureau of Finance’s price of $10 and the municipality’s price of $378. The coefficient of determination between the dependent and independent variables was calculated to be R²=0.81. Out of the eight initial variables, three were omitted. The most influential factor affecting housing prices is the quality of construction (ordinary, full, luxury). The least important factor influencing housing prices is the number of sides. The price gap between the low-income (Seyyedi) and middle-income (Elahieh) districts was not confirmed via one-way ANOVA, but their gap with the high-income district (Kosar) was confirmed. It is suggested that the city be divided into two sections, low-income and high-income, as opposed to three, in terms of housing prices.
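
The study uses SPSS's Automatic Linear Modeling; the sketch below shows an analogous ordinary-least-squares fit in Python, with a hypothetical data file and predictor names standing in for the eight surveyed variables.

```python
# Analogous sketch using ordinary least squares; the predictors and data file are
# hypothetical stand-ins for the eight variables surveyed in the study.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mashhad_housing.csv")  # assumed file with one row per residential unit

# price per square meter regressed on a few assumed predictors, e.g. construction
# quality, floor area, building age, and number of open sides.
model = smf.ols(
    "price_per_m2 ~ C(construction_quality) + floor_area + building_age + n_sides",
    data=df,
).fit()

print(model.summary())          # coefficients show each factor's effect on price
print("R-squared:", model.rsquared)
```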

Keywords: automatic linear modeling, housing prices, Mashhad, Iran

Procedia PDF Downloads 257
29149 Timetabling for Interconnected LRT Lines: A Package Solution Based on a Real-world Case

Authors: Huazhen Lin, Ruihua Xu, Zhibin Jiang

Abstract:

In this real-world case, timetabling the LRT network as a whole is rather challenging for the operator: planners are expected to create a timetable that avoids various route conflicts manually while satisfying a given interval and number of rolling stock, but the outcome is not satisfactory. Therefore, the operator adopts a computerised timetabling tool, the Train Plan Maker (TPM), to cope with this problem. However, with various constraints in the dual-line network, it is still difficult to find an adequate pairing of turnback time, interval and rolling stock number, which requires extra manual intervention. Aiming at the current problems, a one-off model for timetabling is presented in this paper to simplify the timetabling procedure. Before the timetabling procedure starts, this paper shows how the dual-line system, with a ring and several branches, is turned into a simpler structure. Then, a non-linear programming model is presented in two stages. In the first stage, the model sets a series of constraints aimed at calculating a proper timing for coordinating the two lines by adjusting the turnback time at the termini. Then, based on the result of the first stage, the model introduces a series of inequality constraints to avoid various route conflicts. With this model, an analysis is conducted to reveal the relation between the ratio of trains in different directions and the possible minimum interval, observing that the more imbalanced the ratio is, the less feasible it becomes to provide frequent service under such strict constraints.

Keywords: light rail transit (LRT), non-linear programming, railway timetabling, timetable coordination

Procedia PDF Downloads 90
29148 The Effects of Human Activities on Plant Diversity in Tropical Wetlands of Lake Tana (Ethiopia)

Authors: Abrehet Kahsay Mehari

Abstract:

Aquatic plants provide the physical structure of wetlands and increase their habitat complexity and heterogeneity, and as such, have a profound influence on other biota. In this study, we investigated how human disturbance activities influenced the species richness and community composition of aquatic plants in the wetlands of Lake Tana, Ethiopia. Twelve wetlands were selected: four lacustrine, four river mouths, and four riverine papyrus swamps. Data on aquatic plants, environmental variables, and human activities were collected during the dry and wet seasons of 2018. A linear mixed effect model and a distance-based Redundancy Analysis (db-RDA) were used to relate aquatic plant species richness and community composition, respectively, to human activities and environmental variables. A total of 113 aquatic plant species, belonging to 38 families, were identified across all wetlands during the dry and wet seasons. Emergent species covered the largest area, at 73.45%, and attained the highest relative abundance, followed by amphibious and other growth forms. The mean taxonomic richness of aquatic plants was significantly lower in wetlands with high overall human disturbance scores compared to wetlands with low overall human disturbance scores. Moreover, taxonomic richness showed a negative correlation with livestock grazing, tree plantation, and sand mining. The community composition also varied across wetlands with varying levels of human disturbance and was primarily driven by turnover (i.e., replacement of species) rather than nestedness (i.e., loss of species). Distance-based redundancy analysis revealed that livestock grazing, tree plantation, sand mining, waste dumping, and crop cultivation were significant predictors of variation in aquatic plant community composition in the wetlands. Linear mixed effect models and distance-based redundancy analysis also revealed that water depth, turbidity, conductivity, pH, sediment depth, and temperature were important drivers of variation in aquatic plant species richness and community composition. Papyrus swamps had the highest species richness and supported distinct plant communities. Conservation efforts should therefore focus on these habitats, and measures should be taken to restore the highly disturbed and species-poor wetlands near the river mouths.
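
As a rough illustration of the richness model, the sketch below fits a linear mixed effect model with an assumed data layout; the file name and column names are placeholders, not the study's variables.

```python
# Minimal sketch of the species-richness model; the data file and column names
# are assumed, not taken from the study.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lake_tana_wetlands.csv")  # one row per wetland visit (assumed)

# Taxonomic richness as a function of human-disturbance and environmental
# variables, with wetland as the random (grouping) factor across seasons.
mixed = smf.mixedlm(
    "richness ~ disturbance_score + grazing + sand_mining + water_depth + turbidity",
    data=df,
    groups=df["wetland_id"],
).fit()

print(mixed.summary())
```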

Keywords: species richness, community composition, aquatic plants, wetlands, Lake Tana, human disturbance activities

Procedia PDF Downloads 128
29147 Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule

Authors: Ming-Jong Yao, Chin-Sum Shui, Chih-Han Wang

Abstract:

This paper is developed from a real-world decision scenario in which an industrial gas company that applies the Vendor Managed Inventory model supplies liquid oxygen with a self-operated heterogeneous vehicle fleet to hospitals in nearby cities. We name this problem the Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule and formulate it as a non-linear mixed-integer programming problem which simultaneously determines the length of the planning cycle (PC), the length of the replenishment cycle and the replenishment dates for each customer, and the vehicle routes of each day within the PC, such that the average daily operation cost within the PC, including inventory holding cost, setup cost, transportation cost, and overtime labor cost, is minimized. A solution method based on a genetic algorithm, embedded with an encoding and decoding mechanism and local search operators, is then proposed, and a hash function is adopted to avoid repetitive fitness evaluation of identical solutions. Numerical experiments demonstrate that the proposed solution method can effectively solve the problem under different lengths of the PC and numbers of customers. The method is also shown to be effective in determining whether the company should expand the storage capacity of a customer whose demand increases. Sensitivity analysis of the vehicle fleet composition shows that deploying a mixed fleet can reduce the daily operating cost.
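
The hash-based caching of fitness evaluations mentioned above can be sketched as follows; the fitness function here is a placeholder, not the paper's routing and inventory cost model.

```python
# Sketch of the hashing idea: identical chromosomes are detected by a hash key
# so their (expensive) fitness is evaluated only once per distinct solution.
# The fitness function is a placeholder, not the paper's cost model.

def evaluate_fitness(chromosome):
    """Placeholder for the expensive routing/inventory cost evaluation."""
    return sum(gene * i for i, gene in enumerate(chromosome, start=1))

fitness_cache = {}

def cached_fitness(chromosome):
    key = hash(tuple(chromosome))          # hash of the encoded solution
    if key not in fitness_cache:
        fitness_cache[key] = evaluate_fitness(tuple(chromosome))
    return fitness_cache[key]

population = [[1, 0, 2], [2, 1, 0], [1, 0, 2]]  # third solution repeats the first
scores = [cached_fitness(ind) for ind in population]
print(scores, "evaluations performed:", len(fitness_cache))
```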

Keywords: cyclic inventory routing problem, joint replenishment, heterogeneous vehicle, genetic algorithm

Procedia PDF Downloads 88
29146 The Child Attachment Interview: A Psychometric Longitudinal Validation Study in a German Sample

Authors: Jorn Meyer, Stefan Sturmer

Abstract:

The assessment of attachment patterns in toddlers and adults has been well researched, and valid diagnostic methods (e.g., Strange Situation Test, Adult Attachment Interview) are available. For middle and late childhood, on the other hand, only a few validated methods are available so far. For the Child Attachment Interview (CAI), promising validation studies from English-speaking countries are available, but a comprehensive study of its validity in a German sample is so far lacking. Within the scope of a longitudinal project, the results of the first measurement point are reported in this study. A German-language version of the CAI was administered to 111 primary school children (56% female; age: M = 8.34, SD = 0.49). With respect to psychometric quality criteria, parameters for interrater reliability, construct validity, and discriminant and convergent validity are reported. Analyses of the correlations between attachment patterns and internalizing and externalizing behavior problems from parent and teacher reports are presented. The implications for the German-language assessment of attachment in middle and late childhood in research and individual case diagnostics, e.g., in the context of preparing expert evaluation reports for family courts, are discussed.

Keywords: attachment, attachment assessment, developmental psychology, longitudinal study

Procedia PDF Downloads 242
29145 Extreme Rainfall Frequency Analysis for Meteorological Sub-Division 4 of India Using L-Moments

Authors: Arti Devi, Parthasarthi Choudhury

Abstract:

Extreme rainfall frequency for Meteorological Sub-Division 4 of India was analysed using the L-moments approach. Serial correlation and Mann-Kendall tests were conducted to check the serial independence and stationarity of the observations. The discordancy measure was computed for the sites to detect discordant sites. Regional homogeneity was tested by comparison with 500 simulated homogeneous regions generated from a four-parameter Kappa distribution. The best-fit distribution was selected based on the ZDIST statistic and the L-moment ratio diagram from the five extreme value distributions GPD, GLO, GEV, P3 and LP3. The LN3 distribution was selected, and a regional rainfall frequency relationship was established using the index-rainfall procedure. A regional mean rainfall relationship was developed using multiple linear regression with the latitude and longitude of the sites as variables.
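
For reference, sample L-moments and L-moment ratios, the building blocks of the discordancy, homogeneity and ZDIST statistics, can be computed from probability-weighted moments as sketched below; the rainfall series shown is an assumed example, not station data.

```python
# Sketch of sample L-moments from probability-weighted moments (PWMs).
# The annual maximum rainfall series below is an assumed example.
import numpy as np

def sample_lmoments(data):
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    # mean, L-scale, L-skewness (t3), L-kurtosis (t4)
    return l1, l2, l3 / l2, l4 / l2

annual_max_rainfall = [112.0, 98.5, 150.2, 87.3, 134.8, 121.6, 175.4, 94.1]
l1, l2, t3, t4 = sample_lmoments(annual_max_rainfall)
print(f"l1={l1:.2f}, l2={l2:.2f}, t3={t3:.3f}, t4={t4:.3f}")
```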

Keywords: L-moments, ZDIST statistics, serial correlation, Mann Kendall test

Procedia PDF Downloads 441
29144 Kalman Filter Gain Elimination in Linear Estimation

Authors: Nicholas D. Assimakis

Abstract:

In linear estimation, the traditional Kalman filter uses the Kalman filter gain to produce estimates and predictions of the n-dimensional state vector using the m-dimensional measurement vector. The computation of the Kalman filter gain requires the inversion of an m x m matrix in every iteration. In this paper, a variation of the Kalman filter that eliminates the Kalman filter gain is proposed. In the time-varying case, the elimination of the Kalman filter gain requires the inversion of an n x n matrix and the inversion of an m x m matrix in every iteration. In the time-invariant case, the elimination of the Kalman filter gain requires the inversion of an n x n matrix in every iteration. The proposed Kalman filter gain elimination algorithm may be faster than the conventional Kalman filter, depending on the model dimensions.
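
For context, the sketch below shows one step of the conventional Kalman filter, including the m x m inversion inside the gain computation that the proposed variant removes; the matrices are small illustrative examples, not the paper's model.

```python
# One step of the conventional Kalman filter, showing the m x m inversion
# inside the gain computation. Matrices are small illustrative examples.
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # time update (prediction)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # measurement update: the gain requires inverting the m x m innovation covariance
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

n, m = 2, 1
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])               # measurement matrix
Q = 0.01 * np.eye(n)
R = 0.25 * np.eye(m)
x, P = np.zeros(n), np.eye(n)
x, P = kalman_step(x, P, z=np.array([1.2]), F=F, H=H, Q=Q, R=R)
print(x, P)
```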

Keywords: discrete time, estimation, Kalman filter, Kalman filter gain

Procedia PDF Downloads 197
29143 Post-traumatic Checklist-5 (PCL-5) Psychometric Properties: A Cross-Sectional Study Among the Lebanese Population

Authors: Fadwa Alhalaiqa, Othman Alfuqaha, Anas H. Khalifeh, Mahmoud Alsaraireh, Rami Masa’Deh, Natija S Manaa

Abstract:

Background: Post-traumatic stress disorder (PTSD) usually occurs after traumatic events that exceed the range of common human experience. This study aimed to test the psychometric properties of the PCL-5 checklist, which covers the 20 DSM-5 PTSD symptoms, among the Lebanese population and to identify the prevalence of PTSD. Methods: A cross-sectional survey of the PCL-5 among 950 Lebanese participants was conducted using an online Google Forms survey. Snowball recruitment was used to identify participants for the survey. The STROBE guideline was followed in reporting the current study. Results: Face, content, construct, discriminant, and convergent validity of the PCL-5 were established. Reliability, assessed by Cronbach's alpha, composite reliability, and average variance extracted, was found to be high. We also found that more than half of the participants (55.6%) scored 33 or above, which is the cutoff score for a likely diagnosis of PTSD. Conclusion: The current study provides further support for the validity and reliability of the Arabic-version PCL-5 among non-Western populations. This supports using this tool in the screening of PTSD.
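
The internal-consistency check can be illustrated with a short Cronbach's alpha computation; the response matrix below is a made-up example, not the survey data.

```python
# Sketch of the Cronbach's alpha computation used for the reliability check;
# the response matrix is a small made-up example (rows = respondents,
# columns = PCL-5 items).
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

responses = np.array([
    [3, 2, 3, 4],
    [1, 1, 2, 1],
    [4, 3, 4, 4],
    [2, 2, 1, 2],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")
```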

Keywords: post traumatic stress disorder, psychometric properties, stress, adult population

Procedia PDF Downloads 102
29142 Drastic Increase of Wave Dissipation within Metastructures Having Negative Stiffness Inclusions

Authors: D. Chronopoulos, I. Antoniadis, V. Spitas, D. Koulocheris, V. Polenta

Abstract:

A concept of a simple linear oscillator, incorporating a negative stiffness element is demonstrated to exhibit extraordinary damping properties. This oscillator shares the same overall (static) stiffness, the same mass and the same damping element with a reference classical linear SDOF oscillator. However, it differs from the original SDOF oscillator by appropriately redistributing the component spring stiffness elements and by re-allocating the damping element. Despite the fact that the proposed oscillator incorporates a negative stiffness element, it is designed to be both statically and dynamically stable. Once such an oscillator is optimally designed, it is shown to exhibit an extraordinary apparent damping ratio, which is even several orders of magnitude higher than that of the original SDOF system, especially in cases where the original damping of the SDOF system is low. This damping behavior is not a result of a novel additional extraordinary energy dissipation mechanism, but a result of the phase difference between the positive and the negative stiffness elastic forces, which is in turn a consequence of the proper re-distribution of the stiffness and the damper elements. This fact ensures that an adequate level of elastic forces exists throughout the entire frequency range, able to counteract the inertial and the excitation forces. Next, Acoustic or Phononic Meta-materials are considered, in which one atom is replaced by the concept of the above simple linear oscillator. The results indicate that not only the damping of the meta-material verifies and exceeds the one expected from the so-called "meta-damping" behavior, but also that the band gap of the meta-material can be significantly increased.

Keywords: wave propagation, periodic structures, wave damping, mechanical engineering

Procedia PDF Downloads 357
29141 Low-Level Modeling for Optimal Train Routing and Scheduling in Busy Railway Stations

Authors: Quoc Khanh Dang, Thomas Bourdeaud’huy, Khaled Mesghouni, Armand Toguyéni

Abstract:

This paper studies a train routing and scheduling problem for busy railway stations. Our objective is to allow trains to be routed in dense areas that are reaching saturation. Unlike traditional methods that allocate all the resources needed to set up a route for a train and hold them until the route is freed, our work focuses on the use of resources as trains progress through the railway node. This technique allows a larger number of trains to be routed simultaneously in a railway node and thus reduces its current saturation. To deal with this problem, this study proposes an abstract model and a mixed-integer linear programming formulation to solve it. The applicability of our method is illustrated on a didactic example.

Keywords: busy railway stations, mixed-integer linear programming, offline railway station management, train platforming, train routing, train scheduling

Procedia PDF Downloads 254
29140 Similar Correlation of Meat and Sugar to Global Obesity Prevalence

Authors: Wenpeng You, Maciej Henneberg

Abstract:

Background: Sugar consumption has been overwhelmingly blamed as a major dietary offender in obesity prevalence. Meat intake has been hypothesized as an obesity contributor in previous publications, but many dietary guidelines still suggest including a moderate amount of meat in the daily diet. Comparable sugar and meat exposure data were obtained to assess the difference in the relationships between these two major food groups and obesity prevalence at the population level. Methods: Population-level estimates of obesity and overweight rates, per capita per day exposure to major food groups (meat, sugar, starch crops, fibers, fats and fruits) and total calories, per capita per year GDP, urbanization, and physical inactivity prevalence rate were extracted and matched for statistical analysis. Comparisons of correlation coefficients (Pearson and partial) using Fisher’s r-to-z transformation, and comparisons of β ranges (β ± 2 SE) and their overlap in multiple linear regression (Enter and Stepwise), were used to examine potential differences in the relationships between obesity prevalence and sugar exposure and meat exposure, respectively. Results: Pearson and partial correlation (controlled for total calories, physical inactivity prevalence, GDP and urbanization) analyses revealed that sugar and meat exposure correlated significantly with obesity and overweight prevalence. Fisher's r-to-z transformation did not show a statistically significant difference in Pearson correlation coefficients (z=-0.53, p=0.5961) or partial correlation coefficients (z=-0.04, p=0.9681) between obesity prevalence and sugar exposure versus meat exposure. Both the Enter and Stepwise models in the multiple linear regression analysis showed that sugar and meat exposure were the most significant predictors of obesity prevalence. The large overlap of β ranges in the Enter (0.289-0.573) and Stepwise (0.294-0.582) models indicated that sugar and meat exposure correlated with obesity without a statistically significant difference. Conclusion: Worldwide, sugar and meat exposure correlated with obesity prevalence to the same extent. Like sugar, minimal meat exposure should also be suggested in dietary guidelines.
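
A minimal sketch of the Fisher r-to-z comparison of two correlation coefficients is shown below; the correlation values and sample sizes are placeholders, not the study's estimates.

```python
# Sketch of Fisher's r-to-z comparison of two independent correlation
# coefficients; the r values and sample sizes are placeholders.
import numpy as np
from scipy.stats import norm

def compare_correlations(r1, n1, r2, n2):
    z1, z2 = np.arctanh(r1), np.arctanh(r2)          # Fisher transformation
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p = 2 * (1 - norm.cdf(abs(z)))                   # two-sided p-value
    return z, p

# e.g. correlation of obesity prevalence with sugar vs. with meat exposure
z, p = compare_correlations(r1=0.45, n1=170, r2=0.48, n2=170)
print(f"z = {z:.2f}, p = {p:.4f}")
```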

Keywords: meat, sugar, obesity, energy surplus, meat protein, fats, insulin resistance

Procedia PDF Downloads 306
29139 A Highly Linear, Low-Power 71dB 35.1MHz/4.38GHz Variable Gain Amplifier in 180nm CMOS Technology

Authors: Sina Mahdavi, Faeze Noruzpur, Aysuda Noruzpur

Abstract:

This paper proposes a highly linear, low-power, wideband Variable Gain Amplifier (VGA) with a direct current (DC) gain range of -10.2dB to 60.7dB. By applying the proposed idea to the folded cascode amplifier, it is possible to achieve a 71dB DC gain and 35MHz (-3dB) bandwidth, accompanied by high linearity and low sensitivity as well. It is noteworthy that the proposed idea can also be applied to any differential amplifier. Moreover, the total power consumption and unity-gain bandwidth of the proposed VGA are 1.41mW (with a power supply of 1.8 volts) and 4.37GHz, respectively, and a 0.8pF capacitive load is applied at the output nodes of the amplifier. Furthermore, the proposed structure is simulated across all process corners and at temperatures in the range of -60 to +90 ºC. Simulations are performed for all corner conditions in HSPICE using the BSIM3 model of the 180nm CMOS technology and MATLAB software.

Keywords: variable gain amplifier, low power, low voltage, folded cascode, amplifier, DC gain

Procedia PDF Downloads 119
29138 Bank Profitability Indicators in CEE Countries

Authors: I. Erins, J. Erina

Abstract:

The aim of the present article is to determine the impact of external and internal factors of bank performance on the profitability indicators of CEE countries' banks in the period from 2006 to 2012. On the basis of research conducted abroad on bank and macroeconomic profitability indicators, the authors evaluated the return on average assets (ROAA) and return on average equity (ROAE) indicators of the CEE countries' banks. The authors analyzed the profitability indicators of the banks using descriptive methods, SPSS data analysis methods, and data correlation and linear regression analysis. The authors concluded that most internal and external indicators of bank performance have no direct effect on the profitability of banks in the CEE countries. The only exceptions are credit risk and bank size, which affect one of the measures of bank profitability, return on average equity.

Keywords: banks, CEE countries, profitability, ROAA, ROAE

Procedia PDF Downloads 368
29137 Urinalysis by Surface-Enhanced Raman Spectroscopy on Gold Nanoparticles for Different Diseases

Authors: Leonardo C. Pacheco-Londoño, Nataly J. Galan-Freyle, Lisandro Pacheco-Lugo, Antonio Acosta, Elkin Navarro, Gustavo Aroca-Martínez, Karin Rondón-Payares, Samuel P. Hernández-Rivera

Abstract:

At our Life Science Research Center (LSRC) at Simon Bolivar University, one of the focuses is the diagnosis and prognosis of different diseases, and we have been implementing the use of gold nanoparticles (Au-NPs) for various biomedical applications. In this case, Au-NPs were used for Surface-Enhanced Raman Spectroscopy (SERS) in the diagnosis of different diseases, such as Lupus Nephritis (LN), hypertension (H), preeclampsia (PC), and others. This methodology is proposed for the diagnosis of each disease. First, good SERS signals of the different metabolites were obtained from a mixture of urine samples and Au-NPs. Second, PLS-DA models based on the SERS spectra were built to discriminate each disease and were able to differentiate between sick and healthy patients. Finally, the sensitivity and specificity of the different models were determined to be on the order of 0.9. On the other hand, a second methodology was developed using machine learning models on the data from all the diseases, and, as a result, a discriminant spectral map of the diseases was generated. These studies were possible thanks to joint research between two university research centers and two health sector entities, and the patient samples were treated with ethical rigor and informed consent.
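
A rough sketch of a PLS-DA classifier of this kind, built by regressing binary class labels on the spectra and thresholding the prediction, is given below; the spectra are simulated, not SERS measurements.

```python
# Sketch of a PLS-DA classifier: PLS regression on 0/1 class labels, with a
# 0.5 threshold for prediction. The spectra matrix is simulated, not SERS data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))          # 60 urine spectra x 500 Raman shifts (simulated)
y = np.repeat([0, 1], 30)               # 0 = healthy, 1 = disease
X[y == 1, 100:110] += 1.0               # add a fake "marker band" for the disease class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
plsda = PLSRegression(n_components=3).fit(X_tr, y_tr)
y_pred = (plsda.predict(X_te).ravel() > 0.5).astype(int)

tp = np.sum((y_pred == 1) & (y_te == 1)); fn = np.sum((y_pred == 0) & (y_te == 1))
tn = np.sum((y_pred == 0) & (y_te == 0)); fp = np.sum((y_pred == 1) & (y_te == 0))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```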

Keywords: SERS, Raman, PLS-DA, diseases

Procedia PDF Downloads 143
29136 Comparative Fragility Analysis of Shallow Tunnels Subjected to Seismic and Blast Loads

Authors: Siti Khadijah Che Osmi, Mohammed Ahmad Syed

Abstract:

Underground structures are crucial components that require detailed analysis and design. Tunnels, for instance, are widely constructed as transportation infrastructure and utility networks, especially in urban environments. Given their prime importance to the economy and to public safety, which cannot be compromised, any instability in these tunnels will be highly detrimental to their performance. Recent experience suggests that tunnels become vulnerable during earthquake and blast scenarios. However, only a very limited number of studies have been carried out to understand the dynamic response and performance of underground tunnels under such unpredictable extreme hazards. In view of the importance of enhancing the resilience of these structures, the overall aim of the study is to evaluate the probabilistic future performance of shallow tunnels subjected to seismic and blast loads by developing a detailed fragility analysis. Critical non-linear time-history numerical analyses using the finite element software Midas GTS NX are presented alongside current methods of analysis, taking into consideration structural typology, ground motion and explosive characteristics, the effect of soil conditions, and other associated uncertainties affecting tunnel integrity, which may ultimately lead to catastrophic failure of the structures. The proposed fragility curves for both extreme loadings are discussed and compared, providing significant information on the performance of the tunnel under extreme hazards, which may be beneficial for future risk assessment and loss estimation.
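
Fragility curves of this kind are commonly expressed as a lognormal cumulative distribution of an intensity measure; the sketch below evaluates such curves with assumed median and dispersion parameters, not the paper's fitted values.

```python
# Sketch of a lognormal fragility curve, P(damage >= state | IM); theta and
# beta below are assumed values, not the paper's fitted parameters.
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

def fragility(im, theta, beta):
    """theta: median capacity (in IM units); beta: lognormal dispersion."""
    return norm.cdf(np.log(im / theta) / beta)

im = np.linspace(0.05, 2.0, 200)                  # e.g. peak ground acceleration in g
plt.plot(im, fragility(im, theta=0.55, beta=0.45), label="seismic (assumed)")
plt.plot(im, fragility(im, theta=0.80, beta=0.60), label="blast-equivalent IM (assumed)")
plt.xlabel("intensity measure")
plt.ylabel("probability of exceeding damage state")
plt.legend()
plt.show()
```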

Keywords: fragility analysis, seismic loads, shallow tunnels, blast loads

Procedia PDF Downloads 344
29135 Microstructural Characterization and Mechanical Properties of Al-2Mn-5Fe Ternary Eutectic Alloy

Authors: Emin Çadirli, Izzettin Yilmazer, Uğur Büyük, Hasan Kaya

Abstract:

Al-2Mn-5Fe eutectic alloy (wt.%) was prepared in a graphite crucible under a vacuum atmosphere. The samples were directionally solidified upward at a constant temperature gradient and at four different growth rates using the Bridgman method. The values of eutectic spacing were measured from longitudinal and transverse sections of the samples. The dependence of eutectic spacing on the growth rate was determined using linear regression analysis. The microhardness and tensile strength of the studied alloy were also measured on the directionally solidified samples. The dependence of the microhardness and tensile strength of the directionally solidified Al-2Mn-5Fe eutectic alloy on the growth rate was investigated, and the relationships between them were obtained experimentally using regression analysis. The results obtained in the present work were compared with previous similar experimental results obtained for binary and ternary alloys.

Keywords: eutectic alloy, microhardness, microstructure, tensile strength

Procedia PDF Downloads 473
29134 Factors Influencing Family Resilience and Quality of Life in Pediatric Cancer Patients and Their Caregivers: A Cluster Analysis

Authors: Li Wang, Dan Shu, Shiguang Pang, Lixiu Wang, Bing Xiang Yang, Qian Liu

Abstract:

Background: Cancer is one of the most severe diseases in childhood; long-term treatment and its side effects significantly impact the patient's physical, psychological, and social functioning and quality of life, while also placing substantial physical and psychological burdens on caregivers and families. Family resilience is crucial for children with cancer, helping them cope better with the disease and supporting the family in facing challenges together. As a family-level variable, family resilience requires information from multiple family members. However, to the best of our knowledge, there is currently no research investigating family resilience from the perspectives of both pediatric cancer patients and their caregivers. Therefore, this study aims to investigate the family resilience and quality of life of pediatric cancer patients from a patient–caregiver dyadic perspective. Methods: A total of 149 dyads of pediatric cancer patients and their principal caregivers were recruited from the oncology departments of 4 tertiary hospitals in Wuhan and Taiyuan, China. All participants completed questionnaires that identified their demographic and clinical characteristics and assessed family resilience and quality of life for both the patients and their caregivers. K-means cluster analysis was used to identify different clusters of family resilience based on the reports from patients and caregivers. Multivariate logistic regression and linear regression were used to analyze the factors influencing family resilience and quality of life, as well as the relationship between the two. Results: Three clusters of family resilience were identified: a cluster of high family resilience (HR), a cluster of low family resilience (LR), and a cluster of discrepant family resilience (DR). Most (67.1%) families fell into the cluster with low resilience. Characteristics such as the type of caregiver and the patient's perceived social support differed among the three clusters. Compared to the LR group, families in which the mother is the caregiver and in which the patient has high social support are more likely to be assigned to the HR cluster. The quality of life of caregivers was consistently highest in the HR cluster and lowest in the LR cluster. The patient's quality of life was not related to family resilience. In the linear regression analysis of the patient's quality of life, patients who are the first-born have a higher quality of life, while those living with their parents have a lower quality of life. The participants' characteristics were not associated with the quality of life of caregivers. Conclusions: In most families, family resilience was low. Families with maternal caregivers and patients receiving high levels of social support tend to have higher levels of family resilience. Family resilience was linked to the quality of life of caregivers of pediatric cancer patients. The clinical implications of these findings suggest that healthcare and social support organizations should prioritize and support the participation of mothers in caregiving responsibilities. Furthermore, they should assist families in accessing social support to enhance family resilience. This study also emphasizes the importance of promoting family resilience for enhancing family health and happiness, as well as improving the quality of life of caregivers.
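
A minimal sketch of the k-means clustering step on dyadic resilience scores is shown below; the data file and column names are assumed placeholders, not the study's dataset.

```python
# Sketch of the k-means step on the dyadic resilience reports; the data file
# and column names are assumed, not the study's dataset.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("dyads.csv")   # one row per patient-caregiver dyad (assumed)
X = StandardScaler().fit_transform(df[["patient_resilience", "caregiver_resilience"]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
df["cluster"] = km.labels_

# cluster means help label the groups (high, low, discrepant resilience)
print(df.groupby("cluster")[["patient_resilience", "caregiver_resilience"]].mean())
```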

Keywords: pediatric cancer, cluster analysis, family resilience, quality of life

Procedia PDF Downloads 39
29133 Nonparametric Specification Testing for the Drift of the Short Rate Diffusion Process Using a Panel of Yields

Authors: John Knight, Fuchun Li, Yan Xu

Abstract:

Based on a new method of the nonparametric estimator of the drift function, we propose a consistent test for the parametric specification of the drift function in the short rate diffusion process using observations from a panel of yields. The test statistic is shown to follow an asymptotic normal distribution under the null hypothesis that the parametric drift function is correctly specified, and converges to infinity under the alternative. Taking the daily 7-day European rates as a proxy of the short rate, we use our test to examine whether the drift of the short rate diffusion process is linear or nonlinear, which is an unresolved important issue in the short rate modeling literature. The testing results indicate that none of the drift functions in this literature adequately captures the dynamics of the drift, but nonlinear specification performs better than the linear specification.
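
As a point of reference, the textbook single-series kernel (Nadaraya-Watson) estimator of the drift can be sketched as follows; the paper's estimator uses a panel of yields, so this is only an illustrative analogue on simulated data.

```python
# Sketch of a standard Nadaraya-Watson kernel estimator of the diffusion drift
# from a single discretely observed short-rate series (simulated below).
import numpy as np

def drift_estimate(r, dt, grid, h):
    """mu_hat(x) = E[(r_{t+dt} - r_t)/dt | r_t = x] via Gaussian kernel smoothing."""
    increments = (r[1:] - r[:-1]) / dt
    levels = r[:-1]
    mu = np.empty_like(grid)
    for j, x in enumerate(grid):
        w = np.exp(-0.5 * ((levels - x) / h) ** 2)     # kernel weights
        mu[j] = np.sum(w * increments) / np.sum(w)
    return mu

# simulate a Vasicek-type series just to exercise the estimator
rng = np.random.default_rng(1)
dt, n = 1 / 252, 5000
r = np.empty(n); r[0] = 0.03
for t in range(1, n):
    r[t] = r[t - 1] + 0.5 * (0.03 - r[t - 1]) * dt + 0.01 * np.sqrt(dt) * rng.normal()

grid = np.linspace(r.min(), r.max(), 20)
print(np.round(drift_estimate(r, dt, grid, h=0.005), 4))
```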

Keywords: diffusion process, nonparametric estimation, derivative security price, drift function and volatility function

Procedia PDF Downloads 369
29132 Working Towards More Sustainable Food Waste: A Circularity Perspective

Authors: Rocío González-Sánchez, Sara Alonso-Muñoz

Abstract:

Food waste implies inefficient management of the final stages of the food supply chain. Referring to the United Nations Sustainable Development Goals (SDGs), SDG 12.3 proposes halving per capita food waste at the retail and consumer levels and reducing food losses. In the linear system, food waste is disposed of and, to a lesser extent, recovered or reused after consumption. With its negative effect on stocks, the current food consumption system is based on ‘produce, take and dispose’, which puts huge pressure on raw material and energy resources. Therefore, a greater focus on the circular management of food waste will mitigate the environmental, economic, and social impact, following a Triple Bottom Line (TBL) approach, and consequently support the fulfilment of the SDGs. A mixed methodology is used. A total sample of 311 publications from the Web of Science database was retrieved. Firstly, a bibliometric analysis is performed with the SciMat and VOSviewer software to visualise scientific maps based on co-occurrence analysis of keywords and co-citation analysis of journals. This allows for an understanding of the knowledge structure of this field and the detection of research issues. Secondly, a systematic literature review is conducted of the most influential articles in the years 2020 and 2021, coinciding with the most representative period under study. Thirdly, to support the development of this field, a research agenda is proposed according to the gaps identified regarding the circular economy and food waste management. The results reveal that the main topics are related to waste valorisation, the application of the waste-to-energy circular model, and the anaerobic digestion process for fossil fuel replacement. It is underlined that the use of food waste as a source of clean energy is receiving greater attention in the literature. There is a lack of studies on stakeholders’ awareness and training. In addition, available data would facilitate the implementation of circular principles for food waste recovery, management, and valorisation. The research agenda suggests that circularity networks with suppliers and customers need to be deepened. Technological tools for the implementation of sustainable business models, and a greater emphasis on social aspects through educational campaigns, are also required. This paper contributes to the application of circularity to food waste management by abandoning inefficient linear models. It sheds light on trending topics in the field, guiding scholars toward future research opportunities.

Keywords: bibliometric analysis, circular economy, food waste management, future research lines

Procedia PDF Downloads 113
29131 Non-linear Analysis of Spontaneous EEG After Spinal Cord Injury: An Experimental Study

Authors: Jiangbo Pu, Hanhui Xu, Yazhou Wang, Hongyan Cui, Yong Hu

Abstract:

Spinal cord injury (SCI) has a great negative influence on patients and society. Neurological loss in humans after SCI is a major challenge in clinical practice. In contrast, neural regeneration has been observed in animals after SCI, and such regeneration can be retarded by blocking neural plasticity pathways, showing the importance of neural plasticity in functional recovery. Here we used sample entropy as an indicator of nonlinear dynamics in the brain to quantify plasticity changes in spontaneous EEG recordings of rats before and after SCI. The results showed that the entropy values increased after the injury during one week of recovery. The increasing tendency of the sample entropy values is consistent with that of the behavioral evaluation scores. This indicates the potential application of sample entropy analysis for the evaluation of neural plasticity in the spinal cord injury rat model.
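
A compact sketch of the sample entropy computation is given below; the embedding dimension and tolerance follow common defaults, and the signal is synthetic rather than the recorded EEG.

```python
# Sketch of the sample entropy (SampEn) computation; m and r follow common
# defaults and the signal is synthetic, not the rats' EEG recordings.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)       # template matches of length m
    a = count_matches(m + 1)   # template matches of length m + 1
    return -np.log(a / b)

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 250)                      # 250 Hz, 10 s
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
print(f"SampEn = {sample_entropy(eeg_like):.3f}")
```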

Keywords: spinal cord injury (SCI), sample entropy, nonlinear, complex system, firing pattern, EEG, spontaneous activity, Basso Beattie Bresnahan (BBB) score

Procedia PDF Downloads 465
29130 A Bayesian Parameter Identification Method for Thermorheological Complex Materials

Authors: Michael Anton Kraus, Miriam Schuster, Geralt Siebert, Jens Schneider

Abstract:

Polymers have gained increasing interest as construction materials in civil engineering applications over the last years. Polymeric materials typically show time- and temperature-dependent material behavior, which is accounted for in the context of the theory of linear viscoelasticity. Within the context of this paper, the authors show that some polymeric interlayers for laminated glass cannot be considered thermorheologically simple, as they do not follow a simple TTSP; thus, a methodology for identifying the thermorheologically complex constitutive behavior is needed. ‘Dynamical-Mechanical-Thermal-Analysis’ (DMTA) tests in tensile and shear mode as well as ‘Differential Scanning Calorimetry’ (DSC) tests are carried out on the interlayer material ‘Ethylene-vinyl acetate’ (EVA). A novel Bayesian framework for the master curving process as well as for the detection and parameter identification of the TTSPs, along with their associated Prony series, is derived and applied to the EVA material data. To the best of our knowledge, this is the first time an uncertainty quantification of the Prony series in a Bayesian context has been shown. Within this paper, we could successfully apply the derived Bayesian methodology to the EVA material data to obtain meaningful master curves and TTSPs. The uncertainties occurring in this process can be well quantified. We found that EVA needs two TTSPs with two associated Generalized Maxwell Models. As the methodology is kept general, the derived framework could also be applied to other thermorheologically complex polymers for parameter identification purposes.

Keywords: bayesian parameter identification, generalized Maxwell model, linear viscoelasticity, thermorheological complex

Procedia PDF Downloads 264
29129 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. Particularly, these models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from an Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified as free-flow, transitional, and congested condition. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. Influence on the free-flow and the congested state was estimated to be higher than the transitional flow condition in both evening and morning peak periods. Estimation of the critical speed threshold using CR revealed that 47 mph and 48 mph are speed thresholds for congested and transitional traffic condition during the morning peak hours and evening peak hours, respectively. Free-flow speed thresholds for morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 294
29128 Development of Orbital TIG Welding Robot System for the Pipe

Authors: Dongho Kim, Sung Choi, Kyowoong Pee, Youngsik Cho, Seungwoo Jeong, Soo-Ho Kim

Abstract:

This study concerns an orbital TIG welding robot system that travels on a guide rail installed on the pipe and welds and tracks the pipe seam using LVS (Laser Vision Sensor) joint profile data. The orbital welding robot system consists of the robot, welder, controller, and LVS. Moreover, we can define the relationship between welding travel speed and wire feed speed, and we can construct a linear equation using the maximum and minimum amounts of weld metal. Using this linear equation, we can accurately determine the welding travel speed and the wire feed speed corresponding to the area of weld captured by the LVS. We applied this orbital TIG welding robot system to stainless steel and duplex pipes at the DSME (Daewoo Shipbuilding and Marine Engineering Co., Ltd.) shipyard, and the result of the radiographic test is almost perfect (defect rate: 0.033%).

Keywords: adaptive welding, automatic welding, pipe welding, orbital welding, laser vision sensor, LVS, welding D/B

Procedia PDF Downloads 690
29127 Optimizing Network Latency with Fast Path Assignment for Incoming Flows

Authors: Qing Lyu, Hang Zhu

Abstract:

Various flows in the network are required to go through different types of middleboxes. Improper placement of network middleboxes and improper path assignment for flows can greatly increase network latency and also decrease network performance. Minimizing the total end-to-end latency of all the flows requires assigning paths to the incoming flows. In this paper, the flow path assignment problem with regard to the placement of various kinds of middleboxes is studied. The flow path assignment problem is formulated as a linear programming problem, which is very time-consuming to solve. On the other hand, a naive greedy algorithm is studied, which is very fast but causes much more latency than the linear programming algorithm. Finally, the paper presents a heuristic algorithm named FPA, which takes bottleneck link information and estimated bandwidth occupancy into consideration and achieves near-optimal latency in much less time. Evaluation results validate the effectiveness of the proposed algorithm.

Keywords: flow path, latency, middlebox, network

Procedia PDF Downloads 207
29126 The Role of Macroeconomic Condition and Volatility in Credit Risk: An Empirical Analysis of Credit Default Swap Index Spread on Structural Models in U.S. Market during Post-Crisis Period

Authors: Xu Wang

Abstract:

This research builds linear regressions of the investment-grade and high-yield Credit Default Swap index spreads on U.S. macroeconomic condition and volatility measures, using monthly data from March 2009 to July 2016, to study the relationship between different dimensions of the macroeconomy and overall credit risk quality. The most significant contribution of this research is systematically examining the individual and joint effects of macroeconomic condition and volatility on CDX spreads by including macroeconomic time series that capture different dimensions of the U.S. economy. Industrial production index growth, non-farm payroll growth, consumer price index growth, the 3-month treasury rate, and consumer sentiment are introduced to capture the condition of real economic activity, employment, inflation, monetary policy, and risk aversion, respectively. The conditional variance of the macroeconomic series is constructed using an ARMA-GARCH model and is used to measure macroeconomic volatility. A linear regression model is used to capture the relationships between monthly average CDX spreads and the macroeconomic variables. The Newey–West estimator is used to control for autocorrelation and heteroskedasticity in the error terms. Furthermore, a sensitivity factor analysis and a standardized coefficients analysis are conducted to compare the sensitivity of CDX spreads to different macroeconomic variables and to compare the relative effects of macroeconomic condition versus macroeconomic uncertainty, respectively. This research shows that macroeconomic condition can have a negative effect on the CDX spread, while macroeconomic volatility has a positive effect on determining the CDX spread. Macroeconomic condition and volatility variables can jointly explain more than 70% of the whole variation of the CDX spread. In addition, the sensitivity factor analysis shows that the CDX spread is most sensitive to the Consumer Sentiment index. Finally, the standardized coefficients analysis shows that both macroeconomic condition and volatility variables are important in determining the CDX spread, but the macroeconomic condition category of variables has more relative importance in determining the CDX spread than the macroeconomic volatility category of variables. This research shows that the CDX spread can reflect the individual and joint effects of macroeconomic condition and volatility, which suggests that individual investors and the government should carefully regard the CDX spread as a measure of overall credit risk because the CDX spread is influenced by the macroeconomy. In addition, the significance of macroeconomic condition and volatility variables, such as the non-farm payroll growth rate and industrial production index growth volatility, suggests that the government should pay more attention to overall credit quality in the market when the macroeconomy is weak or volatile.
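
A minimal sketch of the two estimation building blocks, a GARCH-type conditional variance for a macro series and an OLS regression of the CDX spread with Newey-West standard errors, is given below; the data file and column names are assumed placeholders, not the study's dataset.

```python
# Sketch of the two building blocks described above: a GARCH(1,1) conditional
# variance for a macro series (via the `arch` package) and an OLS regression of
# the CDX spread with Newey-West (HAC) standard errors. The data file and
# column names are assumed, not the study's dataset.
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

df = pd.read_csv("cdx_macro_monthly.csv", index_col=0, parse_dates=True)  # assumed

# 1) conditional volatility of, e.g., industrial production growth
garch = arch_model(df["ip_growth"], mean="AR", lags=1, vol="GARCH", p=1, q=1)
garch_fit = garch.fit(disp="off")
df["ip_growth_vol"] = garch_fit.conditional_volatility

# 2) CDX spread regressed on condition and volatility variables, HAC errors
X = sm.add_constant(df[["ip_growth", "payroll_growth", "cpi_growth",
                        "tbill_3m", "sentiment", "ip_growth_vol"]].dropna())
y = df.loc[X.index, "cdx_ig_spread"]
ols = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 3})
print(ols.summary())
```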

Keywords: autoregressive moving average model, credit spread puzzle, credit default swap spread, generalized autoregressive conditional heteroskedasticity model, macroeconomic conditions, macroeconomic uncertainty

Procedia PDF Downloads 167
29125 The Mediation Role of Loneliness in the Relationship between Interpersonal Trust and Empathy

Authors: Ghazal Doostmohammadi, Susan Rahimzadeh

Abstract:

Aim: This research aimed to investigate the relationship between empathy and interpersonal trust and to examine the mediating role of loneliness between them in both genders. Methods: With a correlational descriptive design, 192 university students (130 female and 62 male) responded to the “empathy quotient,” “loneliness,” and “interpersonal trust” questionnaires. These tests were designed and validated by experts in the field. Data were analysed using Pearson correlation and path analysis, a statistical technique that uses standard linear regression equations to determine the degree of conformity of a theoretical causal model with reality. Results: The data analysis showed that there was no significant correlation between interpersonal trust and either loneliness (t=0.169) or empathy (t=0.186), while there was a significant negative correlation (t=0.359) between empathy and loneliness. This means that there is an inverse relationship between empathy and loneliness. The path analysis confirmed the research hypothesis about the mediating role of loneliness between empathy and interpersonal trust, but gender did not play a role in this relationship. Conclusion: As an outcome, clinical professionals and education trainers should pay more attention to interpersonal trust as a basic need and try to recreate and shape it to prevent social breakdown; on the other hand, self-disclosure training (especially in men), expression of feelings, and courage should be given double importance to prevent the consequences of loneliness.

Keywords: empathy, loneliness, interpersonal trust, gender

Procedia PDF Downloads 84
29124 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics

Authors: Janne Engblom, Elias Oikarinen

Abstract:

A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond Generalized Method of Moments (GMM) estimator, which is an extension of the Arellano-Bond model in which past values and different transformations of past values of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano–Bover/Blundell–Bond estimator augments Arellano–Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations, the original equation and the transformed one, and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano–Bover/Blundell–Bond estimation technique together with ordinary OLS. The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. The Arellano–Bover/Blundell–Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data on 14 Finnish cities over 1988-2012, the differences in short-run housing price dynamics estimates were considerable when different models and instruments were used. In particular, the use of different instrumental variables caused variation in the model estimates and in their statistical significance. This was particularly clear when comparing the estimates of OLS with those of the different dynamic panel data models. The estimates provided by the dynamic panel data models were more in line with the theory of housing price dynamics.

Keywords: dynamic model, fixed effects, panel data, price dynamics

Procedia PDF Downloads 1510
29123 Motion of a Dust Grain Type Particle in Binary Stellar Systems

Authors: Rajib Mia, Badam Singh Kushvah

Abstract:

In this paper, we use the photogravitational version of the restricted three-body problem (RTBP) in binary systems. In the photogravitational RTBP, an infinitesimal particle (dust grain) moves under the gravitational attraction and radiation pressure of the two bigger primaries. The third particle does not affect the motion of the two bigger primaries. The zero-velocity curves, zero-velocity surfaces and their projections on the plane are studied. We have used an existing analytical method to solve the equations of motion. We have obtained the Lagrangian points in some binary stellar systems. It is found that the mass reduction factor affects the Lagrangian points. The linear stability of the Lagrangian points is studied, and it is found that these points are unstable. Moreover, trajectories of the infinitesimal particle at the triangular points are studied.

Keywords: binary systems, Lagrangian points, linear stability, photogravitational RTBP, trajectories

Procedia PDF Downloads 257
29122 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances

Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels

Abstract:

The prediction models for the United States Medical Licensure Examination (USMLE) Steps 1 and 2 performances were constructed by the Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models to accurately identify the most important predictors and yield the valid range estimations of the Steps 1 and 2 scores. The application of simulation modeling approach was deemed an effective way in predicting student performances on licensure examinations. Also, sensitivity analysis (a/k/a what-if analysis) in the simulation models was used to predict the magnitudes of Steps 1 and 2 affected by changes in the National Board of Medical Examiners (NBME) Basic Science Subject Board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and Step 1 score were significant predictors of the Step 2 performance. Hence, institutions could screen qualified student applicants for interviews and document the effectiveness of basic science education program based on the simulation results.
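
A hedged sketch of the Monte Carlo "what-if" idea is shown below: plausible predictor values are drawn, pushed through a fitted linear model, and summarized as a score range; the coefficients and distributions are illustrative only, not the study's fitted model.

```python
# Sketch of the Monte Carlo "what-if" idea: draw plausible inputs, push them
# through a fitted linear model, and read off a range for the predicted Step 2
# score. The coefficients and distributions below are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 10_000

# assumed regression: Step2 = b0 + b1 * MCAT_verbal + b2 * Step1 + noise
b0, b1, b2, sigma = 60.0, 1.5, 0.55, 8.0

# what-if scenario: distributions for the predictors (illustrative)
mcat_verbal = rng.normal(10.0, 1.5, n_sim)
step1 = rng.normal(220.0, 15.0, n_sim)

step2 = b0 + b1 * mcat_verbal + b2 * step1 + rng.normal(0.0, sigma, n_sim)

lo, med, hi = np.percentile(step2, [5, 50, 95])
print(f"Step 2 predicted range (5th-95th percentile): {lo:.0f}-{hi:.0f}, median {med:.0f}")

# sensitivity: shift Step 1 (e.g. via NBME subject performance) by +10 points
step2_shift = b0 + b1 * mcat_verbal + b2 * (step1 + 10) + rng.normal(0.0, sigma, n_sim)
print(f"median change from +10 Step 1 points: {np.median(step2_shift) - med:.1f}")
```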

Keywords: prediction model, sensitivity analysis, simulation method, USMLE

Procedia PDF Downloads 340