Search results for: best linear unbiased predictor
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3836

2366 The Effects of Human Activities on Plant Diversity in Tropical Wetlands of Lake Tana (Ethiopia)

Authors: Abrehet Kahsay Mehari

Abstract:

Aquatic plants provide the physical structure of wetlands and increase their habitat complexity and heterogeneity, and as such, have a profound influence on other biota. In this study, we investigated how human disturbance activities influenced the species richness and community composition of aquatic plants in the wetlands of Lake Tana, Ethiopia. Twelve wetlands were selected: four lacustrine, four river mouths, and four riverine papyrus swamps. Data on aquatic plants, environmental variables, and human activities were collected during the dry and wet seasons of 2018. A linear mixed-effects model and a distance-based redundancy analysis (db-RDA) were used to relate aquatic plant species richness and community composition, respectively, to human activities and environmental variables. A total of 113 aquatic plant species, belonging to 38 families, were identified across all wetlands during the dry and wet seasons. Emergent species covered the largest area, at 73.45%, and attained the highest relative abundance, followed by amphibious and other life forms. The mean taxonomic richness of aquatic plants was significantly lower in wetlands with high overall human disturbance scores than in wetlands with low overall human disturbance scores. Moreover, taxonomic richness showed a negative correlation with livestock grazing, tree plantation, and sand mining. The community composition also varied across wetlands with varying levels of human disturbance and was primarily driven by turnover (i.e., replacement of species) rather than nestedness (i.e., loss of species). Distance-based redundancy analysis revealed that livestock grazing, tree plantation, sand mining, waste dumping, and crop cultivation were significant predictors of variation in aquatic plant community composition in the wetlands.
Linear mixed-effects models and distance-based redundancy analysis also revealed that water depth, turbidity, conductivity, pH, sediment depth, and temperature were important drivers of variation in aquatic plant species richness and community composition. Papyrus swamps had the highest species richness and supported distinct plant communities. Conservation efforts should therefore focus on these habitats, and measures should be taken to restore the highly disturbed and species-poor wetlands near the river mouths.
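
The study's analysis used a linear mixed-effects model; as a self-contained sketch with synthetic data (all numbers hypothetical), an ordinary least-squares fit of richness against an overall disturbance score illustrates the reported negative relationship:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 12 wetlands, an overall human-disturbance score (0-10),
# and observed aquatic-plant taxonomic richness per wetland.
disturbance = rng.uniform(0, 10, size=12)
richness = 40 - 2.5 * disturbance + rng.normal(0, 2, size=12)

# Design matrix with intercept; solve the least-squares problem.
X = np.column_stack([np.ones_like(disturbance), disturbance])
beta, *_ = np.linalg.lstsq(X, richness, rcond=None)

print(f"intercept = {beta[0]:.2f}, slope = {beta[1]:.2f}")
# A negative slope mirrors the reported decline of richness
# with increasing human disturbance.
```

A full mixed-effects analysis would additionally include wetland (or season) as a random grouping factor; the sketch above only shows the fixed-effect regression step.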

Keywords: species richness, community composition, aquatic plants, wetlands, Lake Tana, human disturbance activities

Procedia PDF Downloads 120
2365 Stoner Impurity Model in Nickel Hydride

Authors: Andrea Leon, J. M. Florez, P. Vargas

Abstract:

The effect of hydrogen adsorption on the magnetic properties of fcc Ni has been calculated using the linear muffin-tin orbital formalism with the local-density approximation for exchange and correlation. Ground-state calculations show that the sequential addition of hydrogen atoms monotonically reduces the total magnetic moment of the fcc Ni structure, as a result of changes in the exchange-splitting parameter and in the Fermi energy. In order to physically explain this reduction of the magnetization as the hydrogen concentration increases, we propose a Stoner impurity model to describe the influence of the H impurity on the magnetic properties of nickel.

Keywords: electronic structure, magnetic properties, nickel hydride, Stoner model

Procedia PDF Downloads 456
2364 Three-Stage Multivariate Stratified Sample Surveys with Probabilistic Cost Constraint and Random Variance

Authors: Sanam Haseen, Abdul Bari

Abstract:

In this paper, a three-stage multivariate stratified sample survey problem with survey costs and variances treated as random variables has been formulated as a non-linear stochastic programming problem. The problem has been converted into an equivalent deterministic form using chance-constraint programming and a modified E-model. An empirical study of the problem is presented at the end of the paper using simulation in R.
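
The chance-constraint conversion can be illustrated numerically. For a total survey cost c'x that is normally distributed with mean vector mu and covariance Sigma, the probabilistic budget constraint P(c'x <= B) >= alpha has the deterministic equivalent mu'x + z_alpha * sqrt(x' Sigma x) <= B. A sketch with hypothetical numbers (not the paper's data):

```python
import numpy as np
from statistics import NormalDist

# Hypothetical allocation over three strata and its random cost model.
x = np.array([20.0, 35.0, 15.0])        # sample sizes per stratum
mu = np.array([2.0, 1.5, 3.0])          # mean unit costs
Sigma = np.diag([0.10, 0.05, 0.20])     # cost covariance (independent here)
budget = 180.0
alpha = 0.95

z = NormalDist().inv_cdf(alpha)         # standard-normal quantile z_0.95
det_lhs = mu @ x + z * np.sqrt(x @ Sigma @ x)

# The probabilistic budget constraint holds iff its deterministic
# equivalent holds.
feasible = det_lhs <= budget
print(f"deterministic LHS = {det_lhs:.2f}, feasible: {feasible}")
```

The deterministic form can then be handed to any non-linear programming solver, which is the step the paper carries out after the conversion.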

Keywords: chance constraint programming, modified E-model, stochastic programming, stratified sample surveys, three stage sample surveys

Procedia PDF Downloads 453
2363 An Analytical Method for Solving General Riccati Equation

Authors: Y. Pala, M. O. Ertas

Abstract:

In this paper, the general Riccati equation is solved analytically by means of a new transformation. Inspecting the transformed equation immediately reveals whether an explicit solution can be obtained. Since the present method does not require a particular solution in order to construct the general solution, it is especially suitable for equations whose particular solutions cannot be seen at first glance. Since the second-order linear equation obtained by the present transformation has the simplest form it can have, it is immediately apparent whether the original equation can be solved analytically. The method is illustrated with several examples.
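
The classical route from a Riccati equation to a second-order linear equation is the substitution y = -u'/(Ru); the paper's own transformation is not reproduced here. A numerical check for y' = 1 - y² (P = 1, Q = 0, R = -1), whose transformed equation u'' - u = 0 gives u = cosh t and hence y = u'/u = tanh t:

```python
import numpy as np

# The substitution y = -u'/(R u) turns the constant-coefficient Riccati
# equation y' = P + Q y + R y^2 into the linear ODE u'' - Q u' + P R u = 0.
# Sketch with P = 1, Q = 0, R = -1:  y' = 1 - y^2, y(0) = 0, so y = tanh(t).

def rk4(f, y0, t):
    """Classical fourth-order Runge-Kutta integration on the grid t."""
    y, out = y0, [y0]
    for t0, t1 in zip(t[:-1], t[1:]):
        h = t1 - t0
        k1 = f(t0, y)
        k2 = f(t0 + h / 2, y + h / 2 * k1)
        k3 = f(t0 + h / 2, y + h / 2 * k2)
        k4 = f(t1, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(y)
    return np.array(out)

t = np.linspace(0.0, 2.0, 201)
y_num = rk4(lambda t_, y_: 1.0 - y_**2, 0.0, t)   # Riccati, integrated directly
y_lin = np.tanh(t)                                 # via the linear equation

print(f"max deviation: {np.max(np.abs(y_num - y_lin)):.2e}")
```

The agreement confirms that solving the transformed linear equation is equivalent to solving the original Riccati equation on this example.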

Keywords: Riccati equation, analytical solution, proper solution, nonlinear

Procedia PDF Downloads 352
2362 An Efficient Approach to Optimize the Cost and Profit of a Tea Garden by Using Branch and Bound Method

Authors: Abu Hashan Md Mashud, M. Sharif Uddin, Aminur Rahman Khan

Abstract:

In this paper, we formulate the construction of a tea garden as a linear programming and integer programming problem and maximize profit within a limited budget and limited resources. The paper presents a new idea of how to optimize profit and focuses on the practical aspects of modeling and the challenges of providing a solution to a complex real-life problem. Finally, a comparative study is carried out among the graphical method, the simplex method, and the branch-and-bound method.
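
As a hedged illustration of the branch-and-bound idea (a toy 0/1 profit-maximization problem with hypothetical numbers, not the paper's tea-garden model), using the fractional relaxation as the pruning bound:

```python
# Toy branch-and-bound for a 0/1 profit-maximization problem.  The LP
# relaxation bound is the classical fractional-knapsack value.
profits = [60, 100, 120]   # profit of each candidate activity (hypothetical)
costs   = [10, 20, 30]     # cost of each activity
budget  = 50

def bound(i, cost, profit):
    """Upper bound: greedily add remaining items, fractionally at the end."""
    for p, c in sorted(zip(profits[i:], costs[i:]),
                       key=lambda pc: pc[0] / pc[1], reverse=True):
        if cost + c <= budget:
            cost, profit = cost + c, profit + p
        else:
            return profit + p * (budget - cost) / c
    return profit

best = 0
stack = [(0, 0, 0)]            # (next item index, cost so far, profit so far)
while stack:
    i, cost, profit = stack.pop()
    best = max(best, profit)   # every node is a feasible partial selection
    if i == len(profits) or bound(i, cost, profit) <= best:
        continue               # prune: relaxation cannot beat the incumbent
    stack.append((i + 1, cost, profit))                      # branch: exclude
    if cost + costs[i] <= budget:
        stack.append((i + 1, cost + costs[i], profit + profits[i]))  # include

print("best profit:", best)
```

The pruning step is what distinguishes branch and bound from plain enumeration: any branch whose relaxation bound cannot beat the incumbent is discarded without being explored.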

Keywords: integer programming, tea garden, graphical method, simplex method, branch and bound method

Procedia PDF Downloads 621
2361 Estimates of Freshwater Content from ICESat-2 Derived Dynamic Ocean Topography

Authors: Adan Valdez, Shawn Gallaher, James Morison, Jordan Aragon

Abstract:

Global climate change has raised atmospheric temperatures, contributing to rising sea levels, decreasing sea ice, and increased freshening of high-latitude oceans. This freshening has increased stratification, inhibiting local mixing and nutrient transport and modifying regional circulations in polar oceans. In recent years, the Western Arctic has seen an increase in freshwater volume at an average rate of 397 ± 116 km³/year. The majority of the freshwater volume resides in the Beaufort Gyre surface lens, driven by anticyclonic wind forcing, sea ice melt, and Arctic river runoff. The total climatological freshwater content is typically defined as water fresher than a salinity of 34.8. The near-isothermal nature of Arctic seawater and non-linearities in the equation of state for near-freezing waters result in a salinity-driven pycnocline, as opposed to the temperature-driven density structure seen at lower latitudes. In this study, we investigate the relationship between freshwater content and remotely sensed dynamic ocean topography (DOT). In-situ measurements of freshwater content are useful in providing information on the freshening rate of the Beaufort Gyre; however, their collection is costly and time consuming. DOT derived from NASA's Advanced Topographic Laser Altimeter System (ATLAS) and freshwater content derived from Airborne Expendable CTDs (AXCTDs) are used to develop a linear regression model. In-situ data for the regression model are collected across the 150° West meridian, which typically defines the centerline of the Beaufort Gyre. Two freshwater content models are determined by integrating the freshwater volume between the surface and an isopycnal corresponding to reference salinities of 28.7 and 34.8. These salinities correspond to the winter pycnocline and the total climatological freshwater content, respectively.
Using each model, we determine the strength of the linear relationship between freshwater content and satellite-derived DOT. The results of this modeling study could provide a future predictive capability for freshwater volume changes in the Beaufort-Chukchi Sea without in-situ methods. Successful employment of the ICESat-2 DOT approximation of freshwater content could potentially reduce reliance on field deployment platforms to characterize physical ocean properties.
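
A minimal sketch of the kind of linear regression model described above, with synthetic DOT and freshwater-content values standing in for the ATLAS and AXCTD data (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical transect: higher dynamic ocean topography (DOT, in m)
# accompanies a thicker freshwater lens, so we posit a linear relation
# FWC = a + b * DOT plus noise.  Numbers are illustrative only.
dot = rng.uniform(-0.3, 0.5, size=40)             # DOT anomaly (m)
fwc = 15.0 + 12.0 * dot + rng.normal(0, 0.8, 40)  # freshwater content (m)

b, a = np.polyfit(dot, fwc, 1)                    # slope, intercept
pred = a + b * dot
r2 = 1 - np.sum((fwc - pred) ** 2) / np.sum((fwc - fwc.mean()) ** 2)

print(f"FWC ~ {a:.2f} + {b:.2f} * DOT,  R^2 = {r2:.3f}")
```

With a fitted slope and intercept in hand, a new satellite DOT value maps directly to an estimated freshwater content, which is the predictive use the abstract envisages.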

Keywords: ICESat-2, dynamic ocean topography, freshwater content, Beaufort Gyre

Procedia PDF Downloads 77
2360 K-Means Based Matching Algorithm for Multi-Resolution Feature Descriptors

Authors: Shao-Tzu Huang, Chen-Chien Hsu, Wei-Yen Wang

Abstract:

Matching high-dimensional features between images is computationally expensive for exhaustive search approaches in computer vision. Although the dimensionality of the features can be reduced by simplifying assumptions based on prior knowledge of the homography, matching accuracy may degrade as a trade-off. In this paper, we present a feature matching method based on the k-means algorithm that reduces the matching cost and matches the features between images without resorting to a simplified geometric assumption. Experimental results show that the proposed method outperforms previous linear exhaustive search approaches in terms of the inlier ratio of matched pairs.
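
A hedged sketch of the clustering idea: partition the reference descriptors with k-means, then search a query only within its nearest cluster rather than exhaustively. The descriptors below are synthetic; the paper's actual pipeline is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 128-D descriptors (SIFT-like) drawn around 8 well-separated
# prototypes, so clustering the reference set is meaningful.
protos = rng.normal(scale=5.0, size=(8, 128))
assign = rng.integers(0, 8, size=500)
ref = protos[assign] + 0.3 * rng.normal(size=(500, 128))

def kmeans(data, k, iters=20, seed=0):
    """Plain Lloyd's algorithm; returns centers and final labels."""
    r = np.random.default_rng(seed)
    centers = data[r.choice(len(data), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.linalg.norm(data[:, None, :] - centers[None], axis=2).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    # Re-assign once against the final centers before returning.
    labels = np.linalg.norm(data[:, None, :] - centers[None], axis=2).argmin(axis=1)
    return centers, labels

centers, labels = kmeans(ref, 8)

# Query: a slightly perturbed copy of one known reference descriptor.
true_idx = 123
query = ref[true_idx] + 0.05 * rng.normal(size=128)

# Search only the query's nearest cluster instead of all 500 descriptors.
cluster = np.linalg.norm(centers - query, axis=1).argmin()
members = np.flatnonzero(labels == cluster)
match = members[np.linalg.norm(ref[members] - query, axis=1).argmin()]

print(f"searched {len(members)} of {len(ref)} descriptors, match = {match}")
```

The matching cost drops roughly by the number of clusters, at the price of one extra query-to-center comparison per cluster; this is the trade-off the abstract contrasts with linear exhaustive search.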

Keywords: feature matching, k-means clustering, SIFT, RANSAC

Procedia PDF Downloads 353
2359 A Sliding Mode Control for a Hybrid Hyperbolic Dynamic System

Authors: Xuezhang Hou

Abstract:

In the present paper, a hybrid hyperbolic dynamic system formulated by partial differential equations with initial and boundary conditions is considered. First, the system is transformed into an abstract evolution system in an appropriate Hilbert space, and the spectral analysis and semigroup generation of the system operator are discussed. Subsequently, a sliding mode control problem is proposed and investigated, and an equivalent control method is introduced and applied to the system. Finally, a significant result, that the state of the controlled system can be approximated to any accuracy by an ideal sliding mode, is derived and examined.
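
The sliding-surface idea can be sketched in finite dimensions. The paper treats an infinite-dimensional hyperbolic system; the double integrator below is only an illustration of how a switching control confines the state to the surface s = 0, on which the reduced dynamics decay:

```python
import numpy as np

# Minimal sliding mode control sketch for x'' = u (double integrator).
# Sliding surface: s = x' + lam * x.  Switching law: u = -k * sign(s).
lam, k, dt = 2.0, 5.0, 1e-3
x, v = 1.0, 0.0                      # initial position and velocity

for _ in range(8000):                # 8 s of simulated time (forward Euler)
    s = v + lam * x                  # distance from the sliding surface
    u = -k * np.sign(s)              # switching control drives s -> 0
    v += u * dt                      # x'' = u
    x += v * dt

print(f"final state: x = {x:.4f}, v = {v:.4f}")
# On the surface s = 0 the dynamics reduce to x' = -lam * x, so the
# state decays toward the origin (up to a small chattering band).
```

The "equivalent control" of the abstract is the continuous control that would keep s identically zero; the discontinuous switching law realizes it on average, at the cost of chattering visible at the step scale.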

Keywords: hyperbolic dynamic system, sliding mode control, semigroup of linear operators, partial differential equations

Procedia PDF Downloads 133
2358 Predictive Value of the Modified Sick Neonatal Score (MSNS) on the Outcome of Critically Ill Neonates Treated in the Neonatal Intensive Care Unit (NICU)

Authors: Oktavian Prasetia Wardana, Martono Tri Utomo, Risa Etika, Kartika Darma Handayani, Dina Angelika, Wurry Ayuningtyas

Abstract:

Background: Critically ill neonates are newborn babies with high-risk factors that can potentially cause disability and/or death. Scoring systems for determining the severity of disease have been widely developed, including several designed for use in neonates. The SNAPPE-II method, which has been used as a mortality-predicting scoring system in several referral centers, was found to be slow in assessing the outcome of critically ill neonates in the Neonatal Intensive Care Unit (NICU). Objective: To analyze the predictive value of the MSNS on the outcome of critically ill neonates from the time of arrival up to 24 hours after admission to the NICU. Methods: A longitudinal observational analytic study based on medical record data was conducted from January to August 2022. For each patient, gestational age, mode of delivery, APGAR score at birth, resuscitation measures at birth, duration of resuscitation, post-resuscitation ventilation, physical examination at birth (including vital signs and any congenital abnormalities), the results of routine laboratory examinations, and the neonatal outcome were recorded from the medical records. Results: This study involved 105 critically ill neonates admitted to the NICU; 50 (47.6%) died, and 55 (52.4%) survived. There were more males than females (61% vs. 39%). The mean gestational age of the subjects was 33.8 ± 4.28 weeks, and the mean birth weight was 1820.31 ± 33.18 g. The mean MSNS score of neonates with a fatal outcome was lower than that of survivors. An ROC curve with a cut-off MSNS score of <10.5 yielded an AUC of 93.5% (95% CI: 88.3-98.6), with a sensitivity of 84% (95% CI: 80.5-94.9), specificity of 80% (95% CI: 88.3-98.6), Positive Predictive Value (PPV) of 79.2%, Negative Predictive Value (NPV) of 84.6%, and Risk Ratio (RR) of 5.14, with a Hosmer-Lemeshow test result of p > 0.05.
Conclusion: The MSNS score has a good predictive value and good calibration for the outcomes of critically ill neonates admitted to the NICU.
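
The reported predictive values follow from a standard 2×2 confusion table. The counts below are reconstructed to be consistent with the reported figures (50 deaths, 55 survivors, sensitivity 84%, specificity 80%) and are illustrative only:

```python
# Reconstructed 2x2 table for the score cutoff (MSNS < 10.5 flags risk):
#                   died     survived
# score < cutoff    tp = 42   fp = 11
# score >= cutoff   fn = 8    tn = 44
tp, fp, fn, tn = 42, 11, 8, 44

sensitivity = tp / (tp + fn)          # true-positive rate
specificity = tn / (tn + fp)          # true-negative rate
ppv = tp / (tp + fp)                  # positive predictive value
npv = tn / (tn + fn)                  # negative predictive value

print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} "
      f"ppv={ppv:.1%} npv={npv:.1%}")
```

Note that these counts reproduce the abstract's PPV of 79.2% and NPV of 84.6% exactly, which is a useful internal-consistency check on the reported figures.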

Keywords: critically ill neonate, outcome, MSNS, NICU, predictive value

Procedia PDF Downloads 66
2357 Parabolic Impact Law of High Frequency Exchanges on Price Formation in Commodities Market

Authors: L. Maiza, A. Cantagrel, M. Forestier, G. Laucoin, T. Regali

Abstract:

Evaluating the impact of High Frequency Trading (HFT) on financial markets is very important for traders who use market analysis to detect winning transaction opportunities. An analysis of HFT data on the tobacco commodity market is discussed here, and an interesting linear relationship is shown between trading frequency and the difference between averaged trading prices above and below the considered trading frequency. This may open new perspectives on the understanding of market data and could provide a possible interpretation of Adam Smith's invisible hand.

Keywords: financial market, high frequency trading, analysis, impacts, Adam Smith invisible hand

Procedia PDF Downloads 356
2356 Calculated Structural and Electronic Properties of Mg and Bi

Authors: G. Patricia Abdel Rahim, Jairo Arbey Rodriguez M, María Guadalupe Moreno Armenta

Abstract:

The present study shows the structural, electronic, and magnetic properties of magnesium (Mg) and bismuth (Bi) in a (1×1×5) supercell. Both materials were studied in five crystalline structures: rock salt (NaCl), cesium chloride (CsCl), zinc-blende (ZB), wurtzite (WZ), and nickel arsenide (NiAs), using Density Functional Theory (DFT), the Generalized Gradient Approximation (GGA), and the Full-Potential Linearized Augmented Plane Wave (FP-LAPW) method. By fitting Murnaghan's equation of state, we determine the lattice constant, the bulk modulus, and its pressure derivative. We also calculated the density of states (DOS) and the band structure.
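
A numerical sketch of the Murnaghan-fit step, with hypothetical parameters: the equilibrium volume is the minimum of E(V), and the bulk modulus follows from the curvature there, B0 = V0 * E''(V0):

```python
import numpy as np

# Murnaghan equation of state (hypothetical parameters, arbitrary units):
#   E(V) = E0 + B0*V/Bp * ((V0/V)**Bp / (Bp - 1) + 1) - B0*V0/(Bp - 1)
E0, B0, Bp, V0 = -10.0, 0.8, 4.0, 20.0

def murnaghan(V):
    return E0 + B0 * V / Bp * ((V0 / V) ** Bp / (Bp - 1) + 1) - B0 * V0 / (Bp - 1)

V = np.linspace(15.0, 25.0, 20001)
E = murnaghan(V)

V_eq = V[E.argmin()]                      # equilibrium volume, should be ~V0

# Bulk modulus from the curvature at the minimum: B0 = V0 * d2E/dV2(V0),
# estimated with a central finite difference.
h = 1e-3
d2E = (murnaghan(V0 + h) - 2 * murnaghan(V0) + murnaghan(V0 - h)) / h**2
B_eq = V0 * d2E

print(f"V_eq = {V_eq:.3f} (target {V0}),  B_eq = {B_eq:.4f} (target {B0})")
```

In practice one fits E0, B0, Bp, and V0 to a set of computed (V, E) points by non-linear least squares; the sketch above only verifies the geometric meaning of the fitted parameters.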

Keywords: bismuth, magnesium, pseudo-potential, supercell

Procedia PDF Downloads 817
2355 First Rank Symptoms in Mania: An Indistinct Diagnostic Strand

Authors: Afshan Channa, Sameeha Aleem, Harim Mohsin

Abstract:

First rank symptoms (FRS) are considered pathognomonic for schizophrenia. However, FRS are not a distinctive feature of schizophrenia: they have also been observed in affective disorder, albeit not included in its diagnostic criteria. The presence of FRS in mania leads to misdiagnosis of psychotic illness, further complicating management and delaying appropriate treatment. FRS in mania are associated with poor clinical and functional outcomes; their presence in the first episode of bipolar disorder may be a predictor of a poor short-term outcome and a decompensating course of illness. FRS in mania have been studied in the West; however, cultural divergence makes it pertinent to study the frequency of FRS in affective disorder independently in Pakistan. Objective: To determine the frequency of first rank symptoms in manic patients under treatment at the psychiatric services of a tertiary care hospital. Method: A cross-sectional study was conducted at the psychiatric services of Aga Khan University Hospital, Karachi, Pakistan. One hundred and twenty manic patients were recruited from November 2014 to May 2015. Patients who were unable to comprehend Urdu or had a comorbid psychiatric or organic disorder were excluded. FRS were assessed by administering the validated Urdu version of the Present State Examination (PSE) tool. Result: The mean age of the patients was 37.62 ± 12.51 years. The mean number of previous manic episodes was 2.17 ± 2.23. FRS were present in 11.2% of males and 30.6% of females; this association of first rank symptoms with gender was found to be significant, with a p-value of 0.008. Overall, 19.2% of patients exhibited FRS in the course of their illness. 43.5% had thought broadcasting, made feelings, impulses, and actions, and somatic passivity; 39.1% had thought insertion; 30.4% had auditory perceptual distortion; and 17.4% had thought withdrawal. However, none displayed delusional perception.
Conclusion: The study confirms the presence of FRS in mania in both males and females, irrespective of the duration of the current manic illness or the number of previous manic episodes. A substantial difference was established between the genders. Being married had no protective effect on the presence of FRS.
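
The gender association can be illustrated with a 2×2 chi-square test using only the standard library. The counts below are hypothetical, chosen only to roughly match the reported 11.2% vs. 30.6% rates; the study's own table (and its p = 0.008) is not reproduced:

```python
from math import erfc, sqrt

# Hypothetical 2x2 table (rows: males, females; cols: FRS present, absent).
table = [[7, 55],
         [13, 30]]

row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
n = sum(row)

# Pearson chi-square statistic: sum over cells of (observed-expected)^2/expected.
chi2 = sum((table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))

# For one degree of freedom, P(Chi2 > x) = erfc(sqrt(x / 2)).
p = erfc(sqrt(chi2 / 2))
print(f"chi2 = {chi2:.3f}, p = {p:.4f}")
```

Any statistic above the df = 1 critical value of 3.84 is significant at the 0.05 level, matching the direction of the study's finding.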

Keywords: first rank symptoms, Mania, psychosis, present state examination

Procedia PDF Downloads 374
2354 Analyzing a Tourism System by Bifurcation Theory

Authors: Amin Behradfar

Abstract:

Tourism has a direct impact on the national revenue of all touristic countries. It creates work opportunities, industries, and investment that serve and raise national performance and culture. This paper is devoted to analyzing the dynamical behaviour of a four-dimensional non-linear tourism-based social-ecological system using codimension-two bifurcation theory. In fact, we investigate its cusp bifurcation. Implications of our mathematical results for the tourism industry are discussed. Moreover, the profitability, compatibility, and sustainability of the tourism system are demonstrated with the aid of the cusp bifurcation and numerical techniques.
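
The cusp bifurcation referred to above has the normal form x' = b1 + b2*x - x³; the number of equilibria jumps from one to three as (b1, b2) crosses the curve 27*b1² = 4*b2³. A quick numeric check (generic parameter values, not the paper's model):

```python
import numpy as np

# Cusp normal form: x' = b1 + b2*x - x**3.  Inside the cusp region
# (27*b1**2 < 4*b2**3) there are three equilibria; outside, only one.

def n_equilibria(b1, b2):
    roots = np.roots([-1.0, 0.0, b2, b1])          # -x^3 + b2*x + b1 = 0
    return int(np.sum(np.abs(roots.imag) < 1e-9))  # count the real roots

inside = n_equilibria(0.1, 1.0)    # 27*0.01 = 0.27 < 4: inside the cusp
outside = n_equilibria(1.0, -1.0)  # b2 < 0: outside, a single equilibrium

print(f"inside cusp region: {inside} equilibria, outside: {outside}")
```

The coexistence of three equilibria inside the cusp region is what allows the hysteresis-like regime changes that the paper interprets in terms of profitability and sustainability.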

Keywords: tourism-based social-ecological dynamical systems, cusp bifurcation, center manifold theory, profitability, compatibility, sustainability

Procedia PDF Downloads 500
2353 Life Time Improvement of Clamp Structural by Using Fatigue Analysis

Authors: Pisut Boonkaew, Jatuporn Thongsri

Abstract:

In the hard disk drive manufacturing industry, the process of eliminating unnecessary parts and qualifying parts before assembly is important. A clamp was therefore designed and fabricated as a fixture for holding parts in the testing process. Testing by trial and error takes a long time to yield improvements, so simulation was introduced to improve the part and reduce the time taken. The problem is that the present clamp has a low life expectancy because of the critical stress that occurs. Hence, simulation was used to study the behavior of stress and compressive force in order to improve the clamp's life expectancy across all candidate designs, of which there are up to 27, excluding repeated designs. The design space was constructed following the full-factorial rules of the six sigma methodology. Six sigma is a well-structured method for improving the quality level by detecting and reducing the variability of a process, so that defects decrease while process capability increases. This research focuses on reducing stress and fatigue while the compressive force remains within the acceptable range set by the company. In the simulation, ANSYS models the 3D CAD geometry under the same conditions as the experiment, and the force at each displacement, from 0.01 to 0.1 mm, is recorded. The ANSYS setup was verified by a mesh-convergence study, and the percentage error with respect to the experimental result was required not to exceed the acceptable range. The improvement therefore focuses on the angle, radius, and length that reduce stress while keeping the force within the acceptable limits. Fatigue analysis is then carried out as the next step, again through ANSYS simulation, in order to guarantee that the lifetime will be extended.
The simulated design was also compared with the actual clamp in order to observe the difference in fatigue between the two designs. This yields a lifetime improvement of up to 57% compared with the actual clamp used in manufacturing. The study provides a setting precise and trustworthy enough to serve as a reference methodology for future designs. Because of the combination and adaptation of the six sigma method, finite element analysis, fatigue analysis, and linear regression analysis, which lead to accurate calculation, this project is expected to save up to 60 million dollars annually.

Keywords: clamp, finite element analysis, structural, six sigma, linear regression analysis, fatigue analysis, probability

Procedia PDF Downloads 232
2352 Investigation on Behavior of Fixed-Ended Reinforced Concrete Deep Beams

Authors: Y. Heyrani Birak, R. Hizaji, J. Shahkarami

Abstract:

Reinforced Concrete (RC) deep beams are special structural elements because of their geometry and their behavior under load. For example, the strain distribution across the cross-section cannot be assumed to be linear. These beams may have simple supports or fixed supports. A great deal of research has been conducted on simply supported deep beams, but little study has been done on the behavior of fixed-end RC deep beams. Recently, the use of fixed-ended deep beams in structures has increased widely. In this study, the behavior of fixed-ended deep beams is investigated, and the important parameters governing the capacity of this type of beam are identified.

Keywords: deep beam, capacity, reinforced concrete, fixed-ended

Procedia PDF Downloads 332
2351 The Gender Criteria of Film Criticism: Creating the ‘Big’, Avoiding the Important

Authors: Eleni Karasavvidou

Abstract:

Social and anthropological research, parallel to Gender Studies, has highlighted the relationship between social structures and symbolic forms as an important field of interaction and a record of 'social trends,' since the study of representations can contribute to the understanding of the social functions and power relations they encompass. This 'mirage,' however, has to do not only with the representations themselves but also with the ways they are received and with the film or critical narratives that are established as dominant or alternative. Cinema and the criticism of its cultural products are no exception. Even in the rapidly changing media landscape of the 21st century, movies remain an integral and widespread part of popular culture, making films an extremely powerful means of 'legitimizing' or 'delegitimizing' visions of domination and commonsensical gender stereotypes throughout society. And yet it is film criticism, the 'language per se,' that legitimizes, reinforces, rewards, and reproduces (or at least ignores) the stereotypical depictions of female roles that remain common in the realm of film images. Hence the need for academic research questioning the gender criteria of film reviews as part of the effort toward an inclusive art and society. Qualitative content analysis is used to examine female roles in selected Oscar-nominated films against their reviews from leading websites and newspapers. This method was chosen because of the complex nature of the depictions in the films and the narratives they evoke. The films were divided into basic scenes depicting social functions, such as love and work relationships and positions of power and their exercise, which were analyzed by content analysis, with borrowings from structuralism (Genette) and the local/universal images of intercultural philology (Wierlacher).
In addition to measuring the overall representation time by gender, other qualitative characteristics were analyzed, such as speaking time, sayings or key actions, and the overall quality of the character's action in relation to the development of the scenario and social representations in general, as well as quantitative aspects (an insufficient number of female lead roles, fewer key supporting roles, and relatively few female directors and people in the production chain) and how these might affect screen representations. The quantitative analysis in this study was used to complement the qualitative content analysis. The focus then shifted to the criteria of film criticism and to the rhetorical narratives that exclude or highlight in relation to gender identities and functions. In the criteria and language of film criticism, stereotypes are often reproduced, or allegedly overturned, within the framework of apolitical 'identity politics,' which mainly addresses the surface of a self-referential cultural-consumer product without connecting it more deeply with material and cultural life. One prime example of this failure is the Bechdel Test, which tracks whether female characters speak in a film regardless of whether women's stories are represented in the films analyzed. If supposedly unbiased male filmmakers still fail to tell truly feminist stories, the same is the case with the criteria of criticism and the related interventions.

Keywords: representations, content analysis, reviews, sexist stereotypes

Procedia PDF Downloads 82
2350 View Synthesis of Kinetic Depth Imagery for 3D Security X-Ray Imaging

Authors: O. Abusaeeda, J. P. O. Evans, D. Downes

Abstract:

We demonstrate the synthesis of intermediary views within a sequence of X-ray images that exhibit depth from motion, or the kinetic depth effect, in a visual display. Each synthetic image removes the requirement for an additional linear X-ray detector array during the image acquisition process. The scale-invariant feature transform (SIFT), in combination with epipolar morphing, is employed to produce the synthetic imagery. A comparison between synthetic and ground-truth images is reported to quantify the performance of the approach. Our work is a key aspect of the development of a 3D imaging modality for the screening of luggage at airport checkpoints. This programme of research is in collaboration with the UK Home Office and the US Dept. of Homeland Security.

Keywords: X-ray, kinetic depth, KDE, view synthesis

Procedia PDF Downloads 262
2349 Implementation of Performance Management and Development System: The Case of the Eastern Cape Provincial Department of Health, South Africa

Authors: Thanduxolo Elford Fana

Abstract:

Rationale and Purpose: Performance management and development systems are central to effective and efficient service delivery, especially in highly labour-intensive sectors such as South African public health. Performance management and development systems seek to ensure that good employee performance is rewarded accordingly, while those who underperform are developed so that they can reach their full potential. An effectively and efficiently implemented performance management system motivates employees and improves employee engagement. The purpose of this study is to examine the implementation of the performance management and development system and the challenges encountered during its implementation in the Eastern Cape Provincial Department of Health. Methods: A qualitative research approach and a case study design were adopted in this study. The primary data were collected through observations, focus group discussions with employees, a group interview with shop stewards, and in-depth interviews with supervisors and managers, from April 2019 to September 2019. There were 45 study participants. In-depth interviews were held with 10 managers at facility level, including the chief executive officer, the chief medical officer, assistant directors in human resources management, patient administration, operations, and finance, two area managers, and two operational nursing managers. A group interview was conducted with five shop stewards, followed by an in-depth interview with one shop steward from the group. Five focus group discussions were conducted with clinical and non-clinical staff, each supplemented with an in-depth interview with one person from the group in order to counter the group effect. Observations covered moderation committee, contracting, and assessment meetings. Findings: The study shows that the performance management and development system was not properly implemented.
There was non-compliance with performance management and development system policy guidelines in terms of timelines for contracting, evaluation, payment of incentives to good performers, and management of poor performance. The study revealed that the system is ineffective in raising the performance of employees and unable to help employees grow. Performance bonuses were no longer paid to qualifying employees. The study also revealed that lack of capacity and commitment, poor communication, constant policy changes, financial constraints, weak and highly bureaucratic management structures, and union interference were challenges encountered during the implementation of the performance management and development system. Lastly, employees and supervisors were rating themselves three irrespective of how well or badly they performed. Conclusion: Performance management is regarded as vital to improving the performance of the health workforce and healthcare service delivery. Effective implementation of a performance management and development system depends on well-capacitated and unbiased management at facility level. There is therefore an urgent need to improve communication, link performance management to rewards, and capacitate staff on the performance management and development system, as it is key to improved public health sector outcomes.

Keywords: challenges, implementation, performance management and development system, public hospital

Procedia PDF Downloads 132
2348 Assessment of the Level of Sedation and Associated Factors Among Intubated Critically Ill Children in the Pediatric Intensive Care Unit of Jimma University Medical Center: A Fourteen-Month Prospective Observational Study, 2023

Authors: Habtamu Wolde Engudai

Abstract:

Background: Sedation can be provided to facilitate a procedure or to stabilize patients admitted in pediatric intensive care unit (PICU). Sedation is often necessary to maintain optimal care for critically ill children requiring mechanical ventilation. However, if sedation is too deep or too light, it has its own adverse effects, and hence, it is important to monitor the level of sedation and maintain an optimal level. Objectives: The objective is to assess the level of sedation and associated factors among intubated critically ill children admitted to PICU of JUMC, Jimma. Methods: A prospective observation study was conducted in the PICU of JUMC in September 2021 in 105 patients who were going to be admitted to the PICU aged less than 14 and with GCS >8. Data was collected by residents and nurses working in PICU. Data entry was done by Epi data manager (version 4.6.0.2). Statistical analysis and the creation of charts is going to be performed using SPSS version 26. Data was presented as mean, percentage and standard deviation. The assumption of logistic regression and the result of the assumption will be checked. To find potential predictors, bi-variable logistic regression was used for each predictor and outcome variable. A p value of <0.05 was considered as statistically significant. Finally, findings have been presented using figures, AOR, percentages, and a summary table. Result: in this study, 105 critically ill children had been involved who were started on continuous or intermittent forms of sedative drugs. Sedation level was assessed using a comfort scale three times per day. Based on this observation, we got a 44.8% level of suboptimal sedation at the baseline, a 36.2% level of suboptimal sedation at eight hours, and a 24.8% level of suboptimal sedation at sixteen hours. 
There was a significant association (p < 0.05) between suboptimal sedation and both the duration of stay on mechanical ventilation and the rate of unplanned extubation, and the Hosmer-Lemeshow goodness-of-fit test indicated adequate model fit (p > 0.44).
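The bivariable association between suboptimal sedation and an outcome such as unplanned extubation can be summarized, for a single 2x2 table, by a crude odds ratio with a Wald 95% confidence interval. The sketch below is a minimal illustration of that first step (the study itself reports adjusted odds ratios from logistic regression); the counts are hypothetical and do not come from the study.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio and 95% Wald CI from a 2x2 table:
    a/b = outcome yes/no among exposed, c/d = outcome yes/no among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 20 of 30 suboptimally sedated children had unplanned
# extubation vs 5 of 30 optimally sedated children (illustrative only)
print(odds_ratio(20, 10, 5, 25))
```

A confidence interval excluding 1 corresponds to the p < 0.05 criterion used in the study.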

Keywords: level of sedation, critically ill children, pediatric intensive care unit, Jimma University

Procedia PDF Downloads 59
2347 Investigation on Electronic and Magnetic Properties of Transition Metals Doped Zinc Selenide

Authors: S. Bentata, W. Benstaali, A. Abbad, H. A. Bentounes, B. Bouadjemi

Abstract:

The full-potential linearized augmented plane wave (FP-LAPW) method based on density functional theory (DFT) is employed to study the electronic, magnetic, and optical properties of some transition-metal-doped ZnSe. Calculations are carried out by varying the doped atoms. Four 3d transition elements were used as dopants, Cr, Mn, Co, and Cu, in order to induce spin polarization. Our results show that Mn- and Cu-doped ZnSe could be used in spintronic devices only if additional dopants are introduced; on the contrary, transition elements showing delocalized character, such as Cr- and Co-doped ZnSe, might be promising candidates for application in spintronics.

Keywords: spin-up, spin-down, magnetic properties, transition metal, composite materials

Procedia PDF Downloads 269
2346 Co-Integrated Commodity Forward Pricing Model

Authors: F. Boudet, V. Galano, D. Gmira, L. Munoz, A. Reina

Abstract:

Commodity pricing needs a specific approach, as commodities are often linked to each other, and so, expectedly, are their prices. They are called co-integrated when at least one stationary linear combination exists between them. Though widespread in the economic literature, and even though many equilibrium relations and co-movements exist in the economy, this principle of co-movement is not well developed in the derivatives field. The present study focuses on the following problem: how can the price of a forward agreement on a commodity be simulated when it is co-integrated with other commodities? A theoretical analysis is developed from the Gibson-Schwartz model, and an analytical solution is given for short-maturity contracts under risk-neutral conditions. The method has been applied to the crude oil and heating oil energy commodities, and the results confirm its applicability.
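The defining property, a stationary linear combination of individually non-stationary prices, can be shown with a minimal simulation. This is not the Gibson-Schwartz model; the random-walk series, the co-integrating slope of 2, and the noise scale are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = np.cumsum(rng.normal(size=n))   # non-stationary "price": a random walk
y = 2.0 * x + rng.normal(size=n)    # co-integrated partner: y - 2x is stationary

beta = np.polyfit(x, y, 1)[0]       # estimate the co-integrating slope by OLS
spread = y - beta * x               # the stationary linear combination
```

While x and y wander without bound, the spread stays in a narrow band, which is what makes co-integration exploitable when jointly simulating forward prices such as crude oil and heating oil.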

Keywords: co-integration, commodities, forward pricing, Gibson-Schwartz

Procedia PDF Downloads 279
2345 Momentum Profits and Investor Behavior

Authors: Aditya Sharma

Abstract:

Profits earned from the relative-strength strategy of a zero-cost portfolio, i.e., taking a long position in winner stocks and a short position in loser stocks of the recent past, are termed momentum profits. In recent times, there has been a lot of controversy and concern about the sources of momentum profits, since the existence of these profits is evidence of earning non-normal returns from publicly available information, directly contradicting the Efficient Market Hypothesis. A literature review reveals conflicting theories and differing evidence on the sources of momentum profits. This paper re-examines the sources of momentum profits in Indian capital markets. The study assesses the effect of fundamental as well as behavioral sources in order to understand the role of investor behavior in stock returns and to suggest improvements (if any) to existing behavioral asset pricing models. The paper adopts the calendar-time methodology to calculate momentum profits for six different strategies, with and without skipping a month between the ranking and holding periods. For each J/K strategy under this methodology, at the beginning of each month t, stocks are ranked on their past j months' average returns and sorted in descending order. Stocks in the upper decile are termed winners and those in the bottom decile losers. After ranking, long and short positions are taken in the winner and loser stocks, respectively, and both portfolios are held for the next k months, so that at any given point in time there are K overlapping long and K overlapping short portfolios, ranked from month t-1 to month t-K. At the end of the period, the returns of both the long and short portfolios are calculated by taking an equally weighted average across all months. The long-minus-short (LMS) return is the momentum profit for each strategy. After testing for momentum profits, CAPM- and Fama-French three-factor-adjusted LMS returns are calculated to study the role market risk plays in momentum profits.
In the final phase of studying the sources, a decomposition methodology is used to break the profits into unconditional means, serial correlations, and cross-serial correlations. This methodology is unbiased, can be used with the decile-based approach, and tests the effects of behavioral and fundamental sources together. The analysis found that momentum profits do exist in Indian capital markets, with market risk playing little role in explaining them. It was also observed that although momentum profits have multiple sources (risk, serial correlations, and cross-serial correlations), cross-serial correlations, i.e., the effect of the returns of other stocks, play the major role. This means that in addition to studying investors' reactions to information about the same firm, it is also important to study how they react to information about other firms. The analysis confirms that investor behavior plays an important role in stock returns and that incorporating both aspects of investors' reactions into behavioral asset pricing models helps make them better.
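The ranking-and-holding mechanics described above can be sketched in a few lines. This is a simplified stand-in, not the paper's exact procedure: it forms one winner and one loser decile per formation month and averages the subsequent k-month holding returns, rather than maintaining K overlapping portfolios explicitly; function and variable names are illustrative.

```python
import numpy as np

def momentum_lms(returns, j=6, k=6, skip=1):
    """Winner-minus-loser (LMS) return series for a J/K momentum strategy.
    returns: (T, N) array of monthly returns for N stocks over T months."""
    T, N = returns.shape
    dec = N // 10                                             # decile size
    lms = []
    for t in range(j + skip, T - k + 1):
        past = returns[t - j - skip : t - skip].mean(axis=0)  # ranking-period mean
        order = np.argsort(past)
        losers, winners = order[:dec], order[-dec:]
        hold = returns[t : t + k]                             # holding-period returns
        lms.append(hold[:, winners].mean() - hold[:, losers].mean())
    return np.array(lms)
```

A positive mean of the LMS series indicates momentum profits; in the paper, these raw profits are then adjusted with the CAPM and the Fama-French three-factor model.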

Keywords: investor behavior, momentum effect, sources of momentum, stock returns

Procedia PDF Downloads 301
2344 Modelling Sudden Deaths from Myocardial Infarction and Stroke

Authors: Y. S. Yusoff, G. Streftaris, H. R. Waters

Abstract:

Death within 30 days is an important factor to be looked into, as there is a significant risk of death immediately following, or soon after, a myocardial infarction (MI) or stroke. In this paper, we model deaths within 30 days following an MI or stroke in the UK, and we examine how the probabilities of sudden death from MI or stroke changed over the period 1981-2000. We model the sudden deaths using a Generalized Linear Model (GLM), fitted with the R statistical package, under a binomial distribution for the number of sudden deaths. We parameterize our model using the extensive and detailed data from the Framingham Heart Study, adjusted to match UK rates. The results show a reduction over time in sudden deaths following an MI but no significant improvement for sudden deaths following a stroke.
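The paper fits its GLM in R. As a language-agnostic illustration, a logit-link binomial GLM can be fitted by Newton-Raphson (iteratively reweighted least squares); the simulated time-trend covariate, coefficients, and case counts below are assumptions for the sketch, not the Framingham parameterization.

```python
import numpy as np

def fit_binomial_glm(X, deaths, n, iters=25):
    """Fit a logit-link binomial GLM by Newton-Raphson (IRLS).
    X: (m, p) design matrix; deaths: (m,) event counts; n: (m,) trial counts."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (deaths - n * p)        # score vector
        W = n * p * (1.0 - p)                # IRLS weights
        H = X.T @ (W[:, None] * X)           # Fisher information
        beta = beta + np.linalg.solve(H, grad)
    return beta

# Illustrative data: 20 "years", 1000 cases per year, declining death probability
rng = np.random.default_rng(3)
years = np.arange(20)
X = np.column_stack([np.ones(20), years / 10.0])
true_beta = np.array([-1.0, -0.5])           # negative slope: improving survival
n_cases = np.full(20, 1000)
deaths = rng.binomial(n_cases, 1.0 / (1.0 + np.exp(-X @ true_beta)))
beta_hat = fit_binomial_glm(X, deaths, n_cases)
```

A significantly negative fitted slope would correspond to the reduction over time reported for MI.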

Keywords: sudden deaths, myocardial infarction, stroke, ischemic heart disease

Procedia PDF Downloads 284
2343 Speed up Vector Median Filtering by Quasi Euclidean Norm

Authors: Vinai K. Singh

Abstract:

Median filtering is a powerful tool for reducing impulsive noise without degrading image contours. In multiband images, for example colour images or vector fields obtained by optic flow computation, a vector median filter can be used. Vector median filters are defined on the basis of a suitable distance, the best-performing distance being the Euclidean. The Euclidean distance is evaluated using the Euclidean norm, which is quite demanding from a computational point of view, given that a square root is required. In this paper, an optimal piecewise-linear approximation of the Euclidean norm is presented and applied to vector median filtering.
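One common piecewise-linear (quasi-Euclidean) approximation in two dimensions is the alpha-max-plus-beta-min form; the coefficients below are a standard choice for this family, not necessarily the optimal ones derived in the paper, and the filter sketch is an assumption of how such a norm plugs into vector median filtering.

```python
import numpy as np

def quasi_euclidean(a, b):
    """Alpha-max-plus-beta-min approximation of sqrt(a^2 + b^2): no square root."""
    a, b = np.abs(a), np.abs(b)
    mx, mn = np.maximum(a, b), np.minimum(a, b)
    return 0.9604 * mx + 0.3978 * mn   # max relative error about 4%

def vector_median(vectors):
    """Vector median: the sample minimizing the summed distances to all others."""
    d = np.array([[quasi_euclidean(*(v - w)) for w in vectors] for v in vectors])
    return vectors[d.sum(axis=1).argmin()]
```

Because the approximation is scale-invariant, its relative error is bounded at every magnitude, so the ranking of candidate medians is rarely affected while the square root is avoided entirely.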

Keywords: euclidean norm, quasi euclidean norm, vector median filtering, applied mathematics

Procedia PDF Downloads 469
2342 Sub-Optimum Safety Performance of a Construction Project: A Multilevel Exploration

Authors: Tas Yong Koh, Steve Rowlinson, Yuzhong Shen

Abstract:

In construction safety management, safety climate has long been linked to workers' safety behaviors and performance. For this reason, the safety climate concept and its tools have been used as heuristics to diagnose a range of safety-related issues by some progressive contractors in Hong Kong and elsewhere. However, as a diagnostic tool, safety climate tends to treat the different components of the climate construct in a linear fashion. Safety management in construction projects, in reality, is a multi-faceted and multilevel phenomenon that resembles a complex system. Hence, understanding safety management in construction projects requires not only an understanding of safety climate but also of the organizational-systemic nature of the phenomenon. Our involvement in, and diagnoses and interpretations of, a range of safety climate-related issues that culminated in sub-optimum safety performance on an infrastructure construction project brought about this revelation. In this study, a range of data types was collected from various hierarchies of the project site organization, including frontline workers and supervisors from the main contractor and sub-contractors, and the client's supervisory personnel. Data collection was performed through the administration of a safety climate questionnaire, interviews, observation, and document study. The findings collectively indicate that what emerged in parallel with the seemingly linear climate-based exploration is an exposition of the organizational-systemic nature of the phenomenon. The results indicate that mismatched climate perceptions, insufficient work planning and risk management, mixed safety leadership, negative workforce attributes, lapsed safety enforcement, and resource shortages collectively gave rise to the project's sub-optimum safety performance.
From the dynamic-causation and multilevel perspectives, the analyses show that individual-, group-, and organizational-level issues are interrelated, and that these interrelationships are linked to a negative safety climate. The adoption of both perspectives has thus enabled a fuller understanding of the phenomenon of safety management, one that points to the need for an organizational-systemic intervention strategy. The core message is that intervention at the individual level will meet with only limited success if the risks embedded at the higher levels of the group and the project organization are not addressed. The findings can be used to guide the effective development of safety infrastructure by linking the different levels of systems in a construction project organization.

Keywords: construction safety management, dynamic causation, multilevel analysis, safety climate

Procedia PDF Downloads 170
2341 Conductive and Stretchable Graphene Nanoribbon Coated Textiles

Authors: Lu Gan, Songmin Shang, Marcus Chun Wah Yuen

Abstract:

A conductive and stretchable cotton fabric was prepared in this study by coating graphene nanoribbons onto the cotton fabric. The mechanical and electrical properties of the prepared fabric were then investigated. As the results show, the graphene nanoribbon-coated cotton fabric exhibited improvements in both mechanical strength and electrical conductivity. Moreover, the resistance of the fabric depended linearly on the strain applied to it. The prepared graphene nanoribbon-coated cotton fabric has great application potential in the smart textile industry.

Keywords: conductive fabric, graphene nanoribbon, coating, enhanced properties

Procedia PDF Downloads 351
2340 Discerning Divergent Nodes in Social Networks

Authors: Mehran Asadi, Afrand Agah

Abstract:

In data mining, partitioning is used as a fundamental tool for classification. With the help of recursive partitioning, we probe the structure of a data set, which allows us to envision decision rules that can be applied to classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of being a divergent node. We used the R statistical computing language to conduct the analyses in this report; the data were found in the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In classification, the main objective is to categorize different items and assign them to groups based on their properties and similarities; we also address overfitting and describe different approaches for evaluating and comparing the performance of classification methods. Estimating densities is hard, especially in high dimensions with limited data; we do not know the densities, but we can estimate them using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another. The correlation coefficients for the predictor variables show that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods, and the method applied to this dataset is decision trees, with k-fold cross-validation used to prune the tree.
A decision tree is a non-parametric classification method that uses a set of rules to predict that each observation belongs to the most commonly occurring class label of its partition of the training data. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
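The role of k-fold cross-validation in controlling tree complexity can be sketched with a hand-rolled example. The study's analysis was done in R; the code below is an illustrative stand-in that uses the simplest possible tree (a depth-1 "stump") together with plain k-fold cross-validation, and all names and data are hypothetical.

```python
import numpy as np

def fit_stump(X, y):
    """Depth-1 tree on feature 0: pick the threshold/polarity maximizing accuracy."""
    best = (0.0, 1, -1.0)
    for t in np.unique(X[:, 0]):
        pred = (X[:, 0] > t).astype(int)
        for pol, p in ((1, pred), (0, 1 - pred)):
            acc = np.mean(p == y)
            if acc > best[2]:
                best = (t, pol, acc)
    return best[:2]

def predict_stump(model, X):
    t, pol = model
    pred = (X[:, 0] > t).astype(int)
    return pred if pol == 1 else 1 - pred

def cross_val_accuracy(fit, predict, X, y, k=5, seed=0):
    """Plain k-fold cross-validation: average held-out accuracy over k folds."""
    folds = np.array_split(np.random.default_rng(seed).permutation(len(y)), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[m] for m in range(k) if m != i])
        accs.append(np.mean(predict(fit(X[train], y[train]), X[test]) == y[test]))
    return float(np.mean(accs))
```

In practice, the same held-out loop is used to compare candidate subtrees of a full tree and keep the smallest one whose cross-validated accuracy does not degrade, which is the pruning idea the abstract describes.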

Keywords: online social networks, data mining, social cloud computing, interaction and collaboration

Procedia PDF Downloads 153
2339 Design and Production of Thin-Walled UHPFRC Footbridge

Authors: P. Tej, P. Kněž, M. Blank

Abstract:

The paper presents the design and production of a thin-walled U-profile footbridge made of UHPFRC. The main structure of the bridge is a single prefabricated shell made of UHPFRC with dispersed steel fibers, without any conventional reinforcement. The span of the bridge structure is 10 m and the clear width is 1.5 m. The thickness of the UHPFRC shell varies between 30 and 45 mm. Several calculations were made during the bridge design and compared with experiments. To verify the calculations, a 1.5 m segment was produced first, followed by the whole footbridge for testing. After the load tests were done, the design was optimized before casting the final footbridge.

Keywords: footbridge, non-linear analysis, shell structure, UHPFRC, Ultra-High Performance Fibre Reinforced Concrete

Procedia PDF Downloads 226
2338 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics

Authors: Janne Engblom, Elias Oikarinen

Abstract:

A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed- and random-effects models, which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication with a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates; several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond generalized method of moments (GMM) estimator, an extension of the Arellano-Bond model in which past values, and different transformations of past values, of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano-Bover/Blundell-Bond estimator augments Arellano-Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations, the original equation and the transformed one, and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano-Bover/Blundell-Bond estimation technique together with ordinary least squares (OLS). The aim of the analysis was to compare conventional fixed-effects panel data models with dynamic panel data models.
The Arellano-Bover/Blundell-Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data for 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics differed considerably across models and instrument choices. In particular, the use of different instrumental variables changed the model estimates and their statistical significance, most clearly when comparing OLS estimates with those of the different dynamic panel data models. The estimates provided by dynamic panel data models were more in line with the theory of housing price dynamics.
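The endogeneity problem, and why lagged values serve as instruments, can be demonstrated with a small simulation. The sketch below uses a simple Anderson-Hsiao-style IV estimator (a precursor of Arellano-Bond, not system GMM itself), and the panel dimensions and parameters are chosen purely for illustration, not taken from the Finnish data.

```python
import numpy as np

rng = np.random.default_rng(42)
N, T, rho = 500, 12, 0.6                 # illustrative panel dimensions
alpha = 0.5 * rng.normal(size=N)         # unobserved individual fixed effects

# y_it = rho * y_i,t-1 + alpha_i + eps_it, started near its stationary state
y = np.empty((N, T))
y[:, 0] = alpha / (1 - rho) + rng.normal(size=N) / np.sqrt(1 - rho**2)
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + alpha + rng.normal(size=N)

dy = np.diff(y, axis=1)                  # first-differencing removes alpha_i
ynow = dy[:, 1:].ravel()                 # Delta y_t
ylag = dy[:, :-1].ravel()                # Delta y_{t-1}: endogenous regressor
z = y[:, :-2].ravel()                    # instrument: the lagged level y_{t-2}

rho_ols = (ylag @ ynow) / (ylag @ ylag)  # OLS in differences: badly biased
rho_iv = (z @ ynow) / (z @ ylag)         # IV with the lagged level: consistent
```

OLS on the differenced equation is biased because Delta y_{t-1} contains eps_{t-1}, which also enters Delta eps_t; the lagged level y_{t-2} predates both error terms, which is exactly the logic that Arellano-Bond and system GMM extend to a full set of lagged instruments.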

Keywords: dynamic model, fixed effects, panel data, price dynamics

Procedia PDF Downloads 1491
2337 Design of a Chaotic Trajectory Generator Algorithm for Mobile Robots

Authors: J. J. Cetina-Denis, R. M. López-Gutiérrez, R. Ramírez-Ramírez, C. Cruz-Hernández

Abstract:

This work addresses the problem of designing an algorithm capable of generating chaotic trajectories for mobile robots. Particularly, the chaotic behavior is induced in the linear and angular velocities of a Khepera III differential mobile robot by infusing them with the states of the Hénon chaotic map. A possible application, using the properties of chaotic systems, is patrolling a work area. In this work, numerical and experimental results are reported and analyzed. In addition, two quantitative numerical tests are applied in order to measure how chaotic the generated trajectories really are.
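A minimal version of such a generator iterates the Hénon map and rescales its bounded states into velocity commands. The scaling ranges below are assumptions for the sketch, not the values used on the Khepera III.

```python
import numpy as np

def henon_trajectory(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
    """Iterate the Henon map: x' = 1 - a*x^2 + y,  y' = b*x."""
    xs, ys = np.empty(n), np.empty(n)
    x, y = x0, y0
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        xs[i], ys[i] = x, y
    return xs, ys

xs, ys = henon_trajectory(500)

# Rescale the bounded chaotic states onto velocity commands (ranges assumed)
v_lin = 0.05 * (xs + 1.5) / 3.0   # linear velocity mapped into [0, 0.05] m/s
v_ang = 2.0 * ys                  # angular velocity, roughly [-0.8, 0.8] rad/s
```

Because the map's states stay on a bounded attractor yet never repeat periodically, the resulting velocity stream keeps the robot inside its work area while making its patrol path hard to predict.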

Keywords: chaos, chaotic trajectories, differential mobile robot, Henon map, Khepera III robot, patrolling applications

Procedia PDF Downloads 304