Search results for: linear eigenvalue analysis
28644 Impact of Teacher’s Behavior in Class Room on Socialization and Mental Health of School Children: A Student’s Perspective
Authors: Umaiza Bashir, Ushna Farukh
Abstract:
The present study examined school students' perspective on teachers' behavioral patterns during classroom teaching and their influence on students' socialization, particularly the formation of peer relationships, and on the development of emotional and behavioral problems in school children. To study these dimensions of the teacher-student classroom relationship, 210 school children (105 girls and 105 boys) aged 14 to 18 years were recruited from government and private schools. A cross-sectional research design with stratified random sampling was used. The Teacher-Student Interaction Scale, with two factors (positive and negative interaction), was used to assess the teacher-student relationship in the classroom. The Peer Relationship Scale was administered to investigate the socialization of students, and the School Children Problem Scale was given to the participants to explore their emotional and behavioral issues. Pearson correlation analysis showed a significant positive relationship between negative teacher-student interaction and students' emotional-behavioral as well as social problems. A t-test revealed that boys perceived more positive interaction with teachers than girls (p < 0.01), while girls showed more emotional-behavioral problems than boys (p < 0.001). Linear regression showed that age, gender, negative teacher-student interaction, and victimization in social gatherings predict mental health problems in school children. This study highlights the need for school counselors to support the mental health of students and teachers.
Keywords: teacher-student interaction, school psychology, student's emotional behavioral problems
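The abstract's statistical pipeline (a Pearson correlation followed by an independent-samples t-test) can be sketched as follows; the scores below are simulated stand-ins, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated stand-ins for the study's scores: negative teacher-student
# interaction and emotional-behavioural problem scores for 210 students.
neg_interaction = rng.normal(50, 10, 210)
problems = 0.6 * neg_interaction + rng.normal(0, 8, 210)

# Pearson correlation between negative interaction and problem scores.
r, p_r = stats.pearsonr(neg_interaction, problems)

# Independent-samples t-test: boys (first 105) vs girls (last 105).
boys, girls = problems[:105], problems[105:]
t, p_t = stats.ttest_ind(boys, girls)
```

With real data, `r` and `p_r` would correspond to the correlation reported in the abstract, and `p_t` to the reported group-difference thresholds.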
Procedia PDF Downloads 169
28643 Non Linear Stability of Non Newtonian Thin Liquid Film Flowing down an Incline
Authors: Lamia Bourdache, Amar Djema
Abstract:
The effect of the non-Newtonian property (power-law index n) on traveling waves in a thin layer of power-law fluid flowing over an inclined plane is investigated. For this, a simplified second-order two-equation model (SM) is used; the complete model (CM) is a second-order four-equation system. The model is derived by combining the weighted-residual integral method with lubrication theory, motivated by the fact that only a small number of waves is observed at the onset of the instability. Using a suitable set of test functions, second-order terms are eliminated from the calculation so that the model remains accurate to the second-order approximation. Linear, spatial, and temporal stabilities are studied. For traveling waves, a particular type of wave form is studied: one that is steady in a moving frame, i.e., that travels at a constant celerity without changing its shape. Solutions of this type, characterized by their celerity, exist under suitable conditions, when the widening due to dispersion is balanced exactly by the narrowing effect of the nonlinearity. Varying the celerity over some range allows the entire spectrum of asymptotic behavior of these traveling waves to be explored. The SM model is converted into a three-dimensional dynamical system, which exhibits bifurcation scenarios such as heteroclinic, homoclinic, Hopf, and period-doubling bifurcations for different values of the power-law index n. The influence of the non-Newtonian parameter on the nonlinear development of these traveling waves is discussed. It is found that the qualitative character of the bifurcation scenarios is insensitive to variation of the power-law index.
Keywords: inclined plane, nonlinear stability, non-Newtonian, thin film
Procedia PDF Downloads 284
28642 Effect of Slip Condition and Magnetic Field on Unsteady MHD Thin Film Flow of a Third Grade Fluid with Heat Transfer down an Inclined Plane
Authors: Y. M. Aiyesimi, G. T. Okedayo, O. W. Lawal
Abstract:
An analysis has been carried out of unsteady MHD thin-film flow of a third-grade fluid down an inclined plane with heat transfer, allowing for slippage between the surface of the plane and the lower surface of the fluid. The governing nonlinear partial differential equations are reduced to linear partial differential equations using the regular perturbation method. The resulting equations are solved analytically using the method of separation of variables and eigenfunction expansion. The solutions obtained are examined and discussed graphically. It is interesting to find that the variation of the velocity and temperature profiles with the slip and magnetic field parameters depends on time.
Keywords: non-Newtonian fluid, MHD flow, thin film flow, third grade fluid, slip boundary condition, heat transfer, separation of variables, eigenfunction expansion
Procedia PDF Downloads 385
28641 A Linear Regression Model for Estimating Anxiety Index Using Wide Area Frontal Lobe Brain Blood Volume
Authors: Takashi Kaburagi, Masashi Takenaka, Yosuke Kurihara, Takashi Matsumoto
Abstract:
Major depressive disorder (MDD) is one of the most common mental illnesses today. It is believed to be caused by a combination of several factors, including stress. Stress can be quantitatively evaluated using the State-Trait Anxiety Inventory (STAI), one of the best indices of anxiety. Although STAI scores are widely used in applications ranging from clinical diagnosis to basic research, they are calculated from a self-reported questionnaire, and an objective evaluation is required because subjects may intentionally change their answers if multiple tests are carried out. In this article, we present a modified index called the "multi-channel Laterality Index at Rest (mc-LIR)", computed by recording brain activity from a wider area of the frontal lobe using multi-channel functional near-infrared spectroscopy (fNIRS). The presented index measures multiple positions near Fpz as defined by the international 10-20 positioning system. Using 24 subjects, we report how the mc-LIR depends on the number of measuring points and its correlation coefficients with STAI scores. Furthermore, a simple linear regression was performed to estimate STAI scores from the mc-LIR, and the cross-validation error is also reported. The experimental results show that using multiple positions near Fpz improves both the correlation coefficients and the estimation accuracy compared with using only two positions.
Keywords: frontal lobe, functional near-infrared spectroscopy, state-trait anxiety inventory score, stress
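As a rough sketch of the estimation step, simple linear regression with leave-one-out cross-validation can be implemented as below; the mc-LIR values and STAI scores are synthetic placeholders, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data for 24 subjects: mc-LIR values and STAI scores
# (illustrative numbers only; not the study's data).
mc_lir = rng.uniform(-1, 1, 24)
stai = 45 + 10 * mc_lir + rng.normal(0, 3, 24)

def fit_line(x, y):
    """Least-squares slope and intercept for y = a*x + b."""
    a, b = np.polyfit(x, y, 1)
    return a, b

# Leave-one-out cross-validation: refit on 23 subjects, predict the 24th.
errors = []
for i in range(len(stai)):
    mask = np.arange(len(stai)) != i
    a, b = fit_line(mc_lir[mask], stai[mask])
    errors.append(stai[i] - (a * mc_lir[i] + b))
cv_rmse = float(np.sqrt(np.mean(np.square(errors))))
```

`cv_rmse` plays the role of the cross-validation error reported in the abstract.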
Procedia PDF Downloads 251
28640 Bayesian Locally Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia
Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui
Abstract:
This paper develops a Local Generalized Linear Spatial Model (LGLSM) to describe the spatial variation of Visceral Leishmaniasis (VL) infection risk in northern and central Tunisia. The response for each region is the number of affected children under five years of age recorded from 1996 through 2006 by Tunisian pediatric departments, treated as Poisson county-level data. The model includes climatic factors, namely average annual rainfall, extreme low temperatures in winter and extreme high temperatures in summer (to characterize the climate of each region via a continentality index), the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall. On the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. Compared with the original GLSM, the Bayesian local model is an improvement and gives a better approximation of the Tunisian VL risk estimation. For Bayesian inference, vague priors are used for all model parameters, and estimation is carried out by Markov Chain Monte Carlo methods.
Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia
Procedia PDF Downloads 398
28639 Automatic Classification of the Stand-to-Sit Phase in the TUG Test Using Machine Learning
Authors: Yasmine Abu Adla, Racha Soubra, Milana Kasab, Mohamad O. Diab, Aly Chkeir
Abstract:
Over the past several years, researchers have shown great interest in assessing the mobility of elderly people in order to measure their functional status. Usually, such an assessment is done by conducting tests that require the subject to walk a certain distance, turn around, and finally sit back down. Consequently, this study aims to provide an at-home monitoring system that assesses the patient's status continuously. We propose a technique to automatically detect when a subject sits down while walking at home. A Doppler radar system was used to capture the motion of the subjects. More than 20 features were extracted from the radar signals, of which 11 were retained based on their intraclass correlation coefficient (ICC > 0.75). The sequential floating forward selection wrapper was then applied to further narrow down the final feature vector. Finally, 5 features were introduced to a linear discriminant analysis classifier, achieving an accuracy of 93.75%, with precision and recall of 95% and 90%, respectively.
Keywords: Doppler radar system, stand-to-sit phase, TUG test, machine learning, classification
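A two-class linear discriminant analysis of the kind described can be sketched with plain NumPy; the 5-dimensional feature vectors below are simulated, not the radar features from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 5-feature vectors for "stand-to-sit" vs "other" windows.
n = 200
X_sit = rng.normal(1.0, 1.0, (n, 5))
X_other = rng.normal(-1.0, 1.0, (n, 5))
X = np.vstack([X_sit, X_other])
y = np.array([1] * n + [0] * n)

# Two-class LDA: pooled covariance, linear decision rule.
mu1, mu0 = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)
cov = np.cov(X[y == 1], rowvar=False) / 2 + np.cov(X[y == 0], rowvar=False) / 2
w = np.linalg.solve(cov, mu1 - mu0)          # discriminant direction
threshold = w @ (mu1 + mu0) / 2              # midpoint between class means
pred = (X @ w > threshold).astype(int)

accuracy = float((pred == y).mean())
```

The linear decision boundary is what makes LDA attractive for lightweight on-device classification.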
Procedia PDF Downloads 161
28638 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation
Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski
Abstract:
In portfolio selection problems, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return, in proportion to the risk measure increase, when compared to risk-free investments. In the classical Markowitz model the risk is measured by the variance, which corresponds to Sharpe ratio optimization and leads to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization; in particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints when real-life financial decisions are based on several thousand scenarios, decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can be dramatically improved by an alternative model based on inverse risk-reward ratio minimization and by taking advantage of LP duality. In the introduced LP model the number of structural constraints is proportional to the number of instruments, so the number of scenarios does not seriously affect simplex method efficiency, guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with second-order stochastic dominance rules.
Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming
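To make the scenario-based LP concrete, here is a minimal sketch of the basic MAD model (minimizing mean absolute deviation subject to a target mean return) using `scipy.optimize.linprog`; the scenario returns are random placeholders, and this illustrates only the primal scenario formulation discussed above, not the authors' dual ratio model:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(7)

# Hypothetical scenario returns: T equally probable scenarios, n instruments.
T, n = 100, 4
R = rng.normal([0.010, 0.020, 0.015, 0.005], 0.05, (T, n))
mu = R.mean(axis=0)                # scenario-mean returns per instrument
target = 0.8 * mu.max()            # a required mean return that is feasible

# Variables z = (x_1..x_n, d_1..d_T): weights x and deviations d,
# where d_t >= |(R_t - mu) @ x| is enforced by two linear inequalities.
D = R - mu
c = np.concatenate([np.zeros(n), np.full(T, 1.0 / T)])   # minimize mean |dev|
A_ub = np.vstack([
    np.hstack([D, -np.eye(T)]),                          #  D x - d <= 0
    np.hstack([-D, -np.eye(T)]),                         # -D x - d <= 0
    np.concatenate([-mu, np.zeros(T)])[None, :],         # mu @ x >= target
])
b_ub = np.concatenate([np.zeros(2 * T), [-target]])
A_eq = np.concatenate([np.ones(n), np.zeros(T)])[None, :]  # weights sum to 1

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, 1)] * n + [(0, None)] * T)
weights, mad = res.x[:n], res.fun
```

Note that the constraint matrix has 2T + 1 rows and n + T columns, i.e. it grows with the number of scenarios T, which is exactly the scaling issue the authors' dual reformulation avoids.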
Procedia PDF Downloads 408
28637 Passive Non-Prehensile Manipulation on Helix Path Based on Mechanical Intelligence
Authors: Abdullah Bajelan, Adel Akbarimajd
Abstract:
Object manipulation techniques in robotics can be categorized into two major groups: manipulation with grasp and manipulation without grasp. The aim of this paper is to develop an object manipulation method that, in addition to being grasp-less, performs the manipulation task in a passive approach. In this method, the linear and angular positions of the object are changed and its manipulation path is controlled. The manipulation path is a helical track with constant radius and incline. The method proposes a system with no actuator and no active controller, so the system requires passive mechanical intelligence to convey the object from the source state along the specified path to the goal state. This intelligence is created by exploiting the geometry of the system components. A general setup of the system components is considered to satisfy the required conditions; then, after kinematic analysis, the detailed dimensions and geometry of the mechanism are obtained. The kinematic results are verified by simulation in ADAMS.
Keywords: mechanical intelligence, object manipulation, passive mechanism, passive non-prehensile manipulation
Procedia PDF Downloads 482
28636 Diet and Exercise Intervention and Bio-Atherogenic Markers for Obesity Classes of Black South Africans with Type 2 Diabetes Mellitus Using Discriminant Analysis
Authors: Oladele V. Adeniyi, B. Longo-Mbenza, Daniel T. Goon
Abstract:
Background: Lipid levels are often low or within normal ranges among Black Africans, and their role in atherogenesis is controversial. The effect of the severity of obesity on some traditional and novel cardiovascular disease risk factors, before and after a diet and exercise maintenance programme, is unclear among obese Black South Africans with type 2 diabetes mellitus (T2DM). Therefore, this study aimed to identify the risk factors that discriminate obesity classes among patients with T2DM before and after a diet and exercise programme. Methods: This interventional cohort of Black South Africans with T2DM followed a very-low-calorie diet and exercise programme in Mthatha between August and November 2013. Gender, age, and the levels of body mass index (BMI), blood pressure, monthly income, daily frequency of meals, random plasma glucose (RPG), serum creatinine, total cholesterol (TC), triglycerides (TG), LDL-C, HDL-C, non-HDL cholesterol, and the TC/HDL, TG/HDL, and LDL/HDL ratios were recorded. Univariate analysis (ANOVA) and multivariate discriminant analysis were performed to separate the obesity classes: normal weight (BMI = 18.5-24.9 kg/m2), overweight (BMI = 25-29.9 kg/m2), obesity class 1 (BMI = 30-34.9 kg/m2), obesity class 2 (BMI = 35-39.9 kg/m2), and obesity class 3 (BMI ≥ 40 kg/m2). Results: At baseline (1st month, September), all 327 patients were overweight or obese: 19.6% overweight, 42.8% obese class 1, 22.3% obese class 2, and 15.3% obese class 3. In discriminant analysis, only systolic blood pressure (SBP, positive association) and the LDL/HDL ratio (negative association) significantly separated increasing obesity classes. At the post-evaluation (3rd month, November), 19.9%, 19.3%, 37.6%, 15%, and 8.3% of the 327 patients had normal weight, overweight, obesity class 1, obesity class 2, and obesity class 3, respectively. There was a significant negative association between serum creatinine and increase in BMI. In discriminant analysis, only age (positive association), SBP (U-shaped relationship), monthly income (inverted U-shaped association), daily frequency of meals (positive association), and the LDL/HDL ratio (positive association) significantly classified increasing obesity classes. Conclusion: There is an epidemic of diabesity (obesity + T2DM) among these Black South Africans despite some weight loss. Further studies are needed to understand the positive and negative linear correlations, and the paradoxical curvilinear correlations, between these markers and increase in BMI among Black South African T2DM patients.
Keywords: atherogenic dyslipidaemia, dietary interventions, obesity, South Africans
Procedia PDF Downloads 370
28635 Amperometric Biosensor for Glucose Determination Based on a Recombinant Mn Peroxidase from Corn Cross-linked to a Gold Electrode
Authors: Anahita Izadyar, My Ni Van, Kayleigh Amber Rodriguez, Ilwoo Seok, Elizabeth E. Hood
Abstract:
Using a recombinant enzyme derived from corn and a simple modification, we fabricated a facile, fast, and cost-effective biosensor to measure glucose. The Nafion/Plant-Produced Mn Peroxidase (PPMP)-glucose oxidase (GOx)-bovine serum albumin (BSA)/Au electrode showed an excellent amperometric response to glucose, detecting it over a wide range of 20.0 µM to 15.0 mM with a lower limit of detection (LOD) of 2.90 µM. The reproducibility of the response across six electrodes was also substantial, with the biosensor detecting glucose concentrations over a wide range of 3.10 ± 0.19 µM to 13.2 ± 1.8 mM. The selectivity of the electrode was investigated in an optimized experimental solution containing 10% diet green tea with citrus, which contains ascorbic acid (AA) and citric acid (CA), over a wide glucose concentration range of 0.02 to 14.0 mM with an LOD of 3.10 µM. Reproducibility in this sample, assessed with 4 electrodes, also showed notable results over the concentration range of 3.35 ± 0.45 µM to 13.0 ± 0.81 mM. We also evaluated this biosensor with other voltammetric methods: linear sweep voltammetry (LSV) detected glucose over a wide range of 0.10-15.0 mM with a lower detection limit of 19.5 µM. The strengths of this enzyme biosensor are its simplicity, wide linear ranges, sensitivity, selectivity, and low limits of detection. We expect the modified biosensor to have potential for monitoring various biofluids.
Keywords: plant-produced manganese peroxidase, enzyme-based biosensors, glucose, modified gold electrode, glucose oxidase
Procedia PDF Downloads 141
28634 Analysis of Cell Cycle Status in Radiation Non-Targeted Hepatoma Cells Using Flow Cytometry: Evidence of Dose Dependent Response
Authors: Sharmi Mukherjee, Anindita Chakraborty
Abstract:
Cellular irradiation incites complex responses, including arrest of cell cycle progression. This article accentuates the effects of radiation on the cell cycle status of radiation non-targeted cells. Human hepatoma HepG2 cells were exposed to increasing doses of γ radiation (1, 2, 4, 6 Gy), and their cell culture media was transferred to non-targeted HepG2 cells cultured in other Petri plates. These radiation non-targeted cells cultured in the ICCM (irradiated cell conditioned media) were the bystander cells, on which cell cycle analysis was performed using flow cytometry. An apparent decrease in the distribution of bystander cells at the G0/G1 phase was observed with increasing radiation doses up to 4 Gy, representing a linear relationship, accompanied by a gradual increase in the cellular distribution at the G2/M phase. Interestingly, the number of cells in the G2/M phase at the 1 and 2 Gy doses was not significantly different, whereas the percentages of G2 phase cells at the 4 and 6 Gy doses were significantly higher than at 2 Gy, indicating the IC50 dose to be between 2 and 4 Gy. Cell cycle arrest is an indirect indicator of genotoxic damage in cells. In this study, bystander stress signals transmitted through the cell culture media of irradiated cells disseminated radiation-induced DNA damage to the non-targeted cells, which resulted in arrest of cell cycle progression at the G2/M checkpoint. This implies that the actual radiobiological effects represent a penumbra, with effects encompassing a larger area than the actual beam. This article highlights the existence of genotoxic damage as a bystander effect of γ rays in human hepatoma cells, as revealed by cell cycle analysis, and opens avenues for the appraisal of bystander stress communication between tumor cells. Contemplation of the underlying signaling mechanisms could be exploited to maximize the damaging effects of radiation with a minimum dose, and thus has therapeutic applications.
Keywords: bystander effect, cell cycle, genotoxic damage, hepatoma
Procedia PDF Downloads 184
28633 The Impact of Hospital Strikes on Patient Care: Evidence from 135 Strikes in the Portuguese National Health System
Authors: Eduardo Costa
Abstract:
Hospital strikes in the Portuguese National Health Service (NHS) are becoming increasingly frequent, raising concerns about patient safety. In fact, data show that mortality rates for patients admitted during strikes are up to 30% higher than for patients admitted on other days. This paper analyses the effects of hospital strikes on patient outcomes. Specifically, it analyzes the impact of different strikes (physicians, nurses, and other health professionals) on in-hospital mortality rates, readmission rates, and length of stay. The paper uses patient-level data containing all NHS hospital admissions in mainland Portugal from 2012 to 2017, together with a comprehensive strike dataset comprising over 250 strike days (19 physician-strike days, 150 nurse-strike days, and 50 other health professional-strike days) from 135 different strikes. The paper uses a linear probability model and controls for hospital and regional characteristics, time trends, and changes in patient composition and diagnoses. Preliminary results suggest a 6-7% increase in in-hospital mortality rates for patients exposed to physicians' strikes. The effect is smaller for patients exposed to nurses' strikes (2-5%). Patients exposed to nurses' strikes during their stay have, on average, higher 30-day urgent readmission rates (4%). Length of stay also seems to increase for patients exposed to any strike. These results, conditional on further testing (namely with non-linear models), suggest that hospital operations and service levels are partially disrupted during strikes.
Keywords: health sector strikes, in-hospital mortality rate, length of stay, readmission rate
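The linear probability model amounts to an OLS regression of a binary outcome on a strike-exposure dummy plus controls. A minimal sketch on simulated admissions (not the Portuguese NHS data) is:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical admissions: a binary death indicator regressed on a
# strike-exposure dummy and one severity control, via plain OLS.
n = 5000
strike = rng.integers(0, 2, n)
severity = rng.normal(0, 1, n)
p_death = np.clip(0.05 + 0.02 * strike + 0.01 * severity, 0, 1)
died = rng.binomial(1, p_death)

X = np.column_stack([np.ones(n), strike, severity])
beta, *_ = np.linalg.lstsq(X, died, rcond=None)

# Coefficient on the dummy: change in mortality probability under strikes.
strike_effect = float(beta[1])
```

The coefficient is directly interpretable as a percentage-point change in mortality, which is why linear probability models are common in this literature despite their known drawbacks (non-constant variance, unbounded fitted values).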
Procedia PDF Downloads 135
28632 Rayleigh-Bénard-Taylor Convection of Newtonian Nanoliquid
Authors: P. G. Siddheshwar, T. N. Sakshath
Abstract:
In this paper we carry out linear and non-linear stability analyses of Rayleigh-Bénard convection of a Newtonian nanoliquid in a rotating medium (Rayleigh-Bénard-Taylor convection). Rigid-rigid isothermal boundaries are considered for the investigation. The Khanafer-Vafai-Lightstone single-phase model is used to study instabilities in nanoliquids, and the thermophysical properties of the nanoliquid are obtained using phenomenological laws and mixture theory. The eigen boundary value problem is solved for the Rayleigh number analytically using trigonometric eigenfunctions. We observe that the critical nanoliquid Rayleigh number is less than that of the base liquid: the onset of convection is advanced by the addition of nanoparticles, so an increase in volume fraction leads to earlier onset and thereby increased heat transport. The amplitudes of the convective modes required for estimating the heat transport are determined analytically, and the tri-modal standard Lorenz model is derived for the steady state assuming small-scale convective motions. The effect of rotation on the onset of convection and on heat transport is investigated and depicted graphically. It is observed that the onset of convection is delayed by rotation, leading to decreased heat transport; hence rotation has a stabilizing effect on the system, due to the fact that part of the energy of the system is used to create the component V. We also observe that the amount of heat transport is less for rigid-rigid isothermal boundaries than for free-free isothermal boundaries.
Keywords: nanoliquid, rigid-rigid, rotation, single phase
Procedia PDF Downloads 236
28631 Assessment of Landfill Pollution Load on Hydroecosystem by Use of Heavy Metal Bioaccumulation Data in Fish
Authors: Gintarė Sauliutė, Gintaras Svecevičius
Abstract:
Landfill leachates contain a number of persistent pollutants, including heavy metals, which are able to spread through ecosystems and accumulate in fish, many of which are classified as top consumers in trophic chains. Fish are free-swimming organisms, but, owing to their species-specific ecological and behavioral properties, they often prefer the most suitable biotopes and therefore do not avoid harmful substances or environments. It is therefore necessary to evaluate the dispersion of persistent pollutants in a hydroecosystem using fish tissue metal concentrations. In hydroecosystems of hybrid type (e.g., river-pond-river), the distance from the pollution source can be a good indicator of such metal distribution. The studies were carried out in the hybrid-type ecosystem neighboring the Kairiai landfill, located 5 km east of the city of Šiauliai. Fish tissue (gills, liver, and muscle) metal concentrations were measured in two ecologically different types of fish, according to their feeding characteristics: benthophagous (gibel carp, roach) and predatory (northern pike, perch). A number of mathematical models (linear, non-linear, and using log and other transformations) were applied in order to find the most satisfactory description of the interdependence between fish tissue metal concentration and the distance from the pollution source. However, only the log-multiple regression model revealed the pattern that the distance from the pollution source is closely and positively correlated with metal concentration in all the predatory fish tissues studied (gills, liver, and muscle).
Keywords: bioaccumulation in fish, heavy metals, hydroecosystem, landfill leachate, mathematical model
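A log regression of the kind described reduces, in the single-predictor case, to fitting a straight line on log-log axes. A sketch with synthetic concentration-distance data (illustrative numbers only, not the Kairiai measurements) is:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical data: distance (km) from the landfill and a fish tissue
# metal concentration following a log-linear relationship, as in the
# pattern the abstract reports for predatory fish.
distance = rng.uniform(0.5, 10, 60)
conc = np.exp(0.8 + 0.15 * np.log(distance) + rng.normal(0, 0.1, 60))

# Log-multiple regression with one predictor: a line on log-log axes.
slope, intercept = np.polyfit(np.log(distance), np.log(conc), 1)
```

A positive `slope` corresponds to the reported positive correlation between distance and tissue concentration.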
Procedia PDF Downloads 288
28630 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been adopted in a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited by a special challenge: numerous regulations require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. To bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model has been developed at Dun and Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to incorporate domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. These features are then employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model matches the score distribution generated by a Machine Learning algorithm, which provides an estimate of the WoE for each bin. This capability helps to build powerful scorecards from sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model was tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. We therefore conclude that, with the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
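The WoE-matching idea can be sketched as follows: the classic WoE per score bin is computed from good/bad counts, while a model-based estimate recovers approximately the same quantity from the mean model score in the bin, which is what makes sparse bins workable. The data, the 5-bin setup, and the use of the true-probability-plus-noise score as a stand-in for an ML model are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical portfolio: true default probabilities and a model score
# approximating them (a stand-in for a Machine Learning model's output).
n = 20000
p_true = rng.uniform(0.02, 0.30, n)
bad = rng.binomial(1, p_true)
score = np.clip(p_true + rng.normal(0, 0.02, n), 0.001, 0.999)

bins = np.quantile(score, np.linspace(0, 1, 6))    # 5 score bins
idx = np.clip(np.digitize(score, bins[1:-1]), 0, 4)

overall_logodds = np.log(bad.mean() / (1 - bad.mean()))
woe_counts, woe_model = [], []
for b in range(5):
    in_bin = idx == b
    # Classic WoE: log of the ratio of the bin's share of goods to its
    # share of bads.
    good_share = (1 - bad[in_bin]).sum() / (1 - bad).sum()
    bad_share = bad[in_bin].sum() / bad.sum()
    woe_counts.append(np.log(good_share / bad_share))
    # Model-based estimate: recover WoE from the mean score in the bin,
    # WoE ~ -(logit(mean score) - logit(overall bad rate)).
    p_bar = score[in_bin].mean()
    woe_model.append(-(np.log(p_bar / (1 - p_bar)) - overall_logodds))
```

With enough data the two columns agree closely; the model-based column remains usable when a bin contains too few observed bads for the count-based formula to be stable.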
Procedia PDF Downloads 136
28629 Methodological Proposal, Archival Thesaurus in Colombian Sign Language
Authors: Pedro A. Medina-Rios, Marly Yolie Quintana-Daza
Abstract:
The opportunity to communicate in social, academic, and work contexts is highly relevant for any individual, and even more so for a deaf person, for whom oral language is not their natural language and written language is a second language. Currently in Colombia, to the best of our knowledge, there is no specialized sign language dictionary for archiving. Archival work is one of the areas in which the deaf community has the greatest chance of employment. Adding new signs to dictionaries for deaf people extends the possibility that they have the appropriate signs to communicate and improves their performance. The aim of this work was to illustrate the importance of designing pedagogical and technological strategies of knowledge management for the academic inclusion of deaf people, through a proposed lexicon in Colombian Sign Language (LSC) for the archival domain. As a method, an analytical study was used to identify relevant words in the technical area of archiving and their counterparts in LSC; 30 deaf people, apprentices and students of the Servicio Nacional de Aprendizaje (SENA) in Documentary or Archival Management programs, were evaluated through direct interviews in LSC. The analysis combined tools for evaluating correlation patterns, linguistic methods of visual-gestural and corpus analysis, and linear regression methods. Among the results, significant associations were found with the variables socioeconomic stratum, academic level, and labor placement, together with the need to generate new signs on the subject of archiving to improve communication between the deaf person, the hearing person, and the sign language interpreter. It is concluded that generating new signs to enrich the LSC dictionary in archival subjects is necessary to improve the labor inclusion of deaf people in Colombia.
Keywords: archival, inclusion, deaf, thesaurus
Procedia PDF Downloads 279
28628 Pooled Analysis of Three School-Based Obesity Interventions in a Metropolitan Area of Brazil
Authors: Rosely Sichieri, Bruna K. Hassan, Michele Sgambato, Barbara S. N. Souza, Rosangela A. Pereira, Edna M. Yokoo, Diana B. Cunha
Abstract:
Obesity is increasing at a fast rate in low and middle-income countries where few school-based obesity interventions have been conducted. Results of obesity prevention studies are still inconclusive mainly due to underestimation of sample size in cluster-randomized trials and overestimation of changes in body mass index (BMI). The pooled analysis in the present study overcomes these design problems by analyzing 4,448 students (mean age 11.7 years) from three randomized behavioral school-based interventions, conducted in public schools of the metropolitan area of Rio de Janeiro, Brazil. The three studies focused on encouraging students to change their drinking and eating habits over one school year, with monthly 1-h sessions in the classroom. Folders explaining the intervention program and suggesting the participation of the family, such as reducing the purchase of sodas were sent home. Classroom activities were delivered by research assistants in the first two interventions and by the regular teachers in the third one, except for culinary class aimed at developing cooking skills to increase healthy eating choices. The first intervention was conducted in 2005 with 1,140 fourth graders from 22 public schools; the second, with 644 fifth graders from 20 public schools in 2010; and the last one, with 2,743 fifth and sixth graders from 18 public schools in 2016. The result was a non-significant change in BMI after one school year of positive changes in dietary behaviors associated with obesity. Pooled intention-to-treat analysis using linear mixed models was used for the overall and subgroup analysis by BMI status, sex, and race. The estimated mean BMI changes were from 18.93 to 19.22 in the control group and from 18.89 to 19.19 in the intervention group; with a p-value of change over time of 0.94. Control and intervention groups were balanced at baseline. 
Subgroup analyses were statistically and clinically non-significant, except for the non-overweight/obese group, which showed a 0.05 reduction of BMI in the intervention compared with the control. In conclusion, this large pooled analysis showed a very small effect on BMI, and only in the normal-weight students. The results are in line with many school-based initiatives that have been promising in modifying behaviors associated with obesity but have had no impact on excessive weight gain. Changes in BMI may require large changes in energy balance that are hard to achieve in primary prevention at the school level.
Keywords: adolescents, obesity prevention, randomized controlled trials, school-based study
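The null result can be checked back-of-the-envelope from the reported group means. A minimal sketch (a simplification: the pooled analysis itself used linear mixed models, intention-to-treat, to handle clustering of students within schools):

```python
# Difference-in-differences on the reported group means. This is a
# simplification: the actual analysis used linear mixed models to account
# for clustering by school and repeated measures on students.
control_change = 19.22 - 18.93        # control group BMI change, ~ +0.29
intervention_change = 19.19 - 18.89   # intervention group BMI change, ~ +0.30
effect = intervention_change - control_change
# effect is about +0.01 BMI units: essentially zero, consistent with p = 0.94
```

The sketch makes plain why the reported p-value is 0.94: both arms gained essentially the same BMI over the school year.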
Procedia PDF Downloads 161
28627 Improving Taint Analysis of Android Applications Using Finite State Machines
Authors: Assad Maalouf, Lunjin Lu, James Lynott
Abstract:
We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, i.e., where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid, and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.
Keywords: android, static analysis, string analysis, taint analysis
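The core idea can be illustrated with a toy abstraction. Here a Python set of possible string values stands in for the finite state automaton used in the paper, and `<secret>` is a hypothetical tainted pattern; the names and the two-value example are invented for illustration:

```python
# Toy abstraction: the analysis tracks the set of strings a variable may
# hold (a finite stand-in for the automaton used in the actual analysis).
TAINT = "<secret>"  # hypothetical tainted pattern

def is_tainted(values):
    """A variable is tainted if any possible value contains the pattern."""
    return any(TAINT in v for v in values)

def abstract_replace(values, old, new):
    """Abstract transformer for str.replace applied to every possible value."""
    return {v.replace(old, new) for v in values}

user_input = {"id=42" + TAINT, "id=7"}  # possible runtime values
assert is_tainted(user_input)

cleaned = abstract_replace(user_input, TAINT, "")
# After the operation, no possible value contains the pattern, so the
# analysis can soundly mark the result untainted -- unlike a conservative
# analyzer, which keeps the taint unless replace is listed as a sanitizer.
result_tainted = is_tainted(cleaned)
```

Tracking the whole value set is what lets the analysis prove the taint is gone, rather than assuming it persists.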
Procedia PDF Downloads 182
28626 Vibration Analysis of Magnetostrictive Nano-Plate by Using Modified Couple Stress and Nonlocal Elasticity Theories
Authors: Hamed Khani Arani, Mohammad Shariyat, Armaghan Mohammadian
Abstract:
In the present study, the free vibration of a magnetostrictive nano-plate (MsNP) resting on a Pasternak foundation is investigated. Firstly, the modified couple stress (MCS) and nonlocal elasticity theories are compared and both taken into account to capture small-scale effects; not only are the two theories analyzed, but it is also shown that the MCS theory is more accurate than the nonlocal elasticity theory in such problems. A feedback control system is utilized to investigate the effects of a magnetic field. First-order shear deformation theory (FSDT), Hamilton's principle, and the energy method are utilized to derive the equations of motion, and these equations are solved by the differential quadrature method (DQM) for simply supported boundary conditions. The MsNP undergoes in-plane forces in the x and y directions. In this regard, the dimensionless frequency is plotted to study the effects of the small-scale parameter, magnetic field, aspect ratio, thickness ratio, and compression and tension loads. Results indicate that these parameters play a key role in the natural frequency. According to these results, MsNPs can be used in communications equipment and for smart vibration control of nanostructures, especially in sensors and actuators such as wireless linear micromotors and smart nano valves in injectors.
Keywords: feedback control system, magnetostrictive nano-plate, modified couple stress theory, nonlocal elasticity theory, vibration analysis
Procedia PDF Downloads 136
28625 Quantification of Effect of Linear Anionic Polyacrylamide on Seepage in Irrigation Channels
Authors: Hamil Uribe, Cristian Arancibia
Abstract:
In Chile, water for irrigation and hydropower generation is delivered essentially through unlined earthen channels, which have high seepage losses. Traditional seepage-abatement technologies are very expensive. The goals of this work were to quantify water loss in unlined channels and to select reaches in which to evaluate the use of linear anionic polyacrylamide (LA-PAM) to reduce seepage losses. The study was carried out in the Maule Region, in the central area of Chile. Water users indicated reaches with potential seepage losses, 45 km of channels in total, whose flow varied between 1.07 and 23.6 m³ s⁻¹. Based on seepage measurements, 4 channel reaches, 4.5 km in total, were selected for LA-PAM application. One to 4 LA-PAM applications were performed at rates of 11 kg ha⁻¹, taking the wetted perimeter area as the basis of calculation. In large channels, a motorboat moving against the current was used to carry out the LA-PAM application. For the applications, a seeder machine was used to distribute the granulated polymer evenly on the water surface. Water flow was measured (StreamPro ADCP) upstream and downstream in the selected reaches to estimate seepage losses before and after LA-PAM application. Weekly measurements were made to quantify the treatment effect and its duration. In each case, water turbidity and temperature were measured. Channels showed variable losses of up to 13.5%. Channels showing water gains were not treated with LA-PAM. In all cases, the LA-PAM effect was positive, reducing average losses from 8% to 3.1%. Water loss was confirmed, and it was possible to reduce seepage through LA-PAM applications provided that losses were known and correctly determined when applying the polymer. This could increase irrigation security in critical periods, especially under drought conditions.
Keywords: canal seepage, irrigation, polyacrylamide, water management
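The loss figures follow from a simple flow balance between the upstream and downstream discharge measurements. A sketch with assumed example flows (the function name and the 10 m³/s figures are illustrative, chosen only to reproduce the reported percentages):

```python
def seepage_loss_pct(q_upstream, q_downstream):
    """Percent of the upstream inflow lost along a reach (negative = gain)."""
    return 100.0 * (q_upstream - q_downstream) / q_upstream

# Assumed example flows (m3/s); the study measured flows between
# 1.07 and 23.6 m3/s with an ADCP upstream and downstream of each reach.
worst_case = seepage_loss_pct(10.0, 8.65)   # 13.5% -- the worst loss reported
after_pam = seepage_loss_pct(10.0, 9.69)    # 3.1% -- average loss after LA-PAM
```

Reaches where the downstream flow exceeded the upstream flow (a negative result here, i.e., a gain) were excluded from treatment, as the abstract notes.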
Procedia PDF Downloads 176
28624 A Reliable Multi-Type Vehicle Classification System
Authors: Ghada S. Moussa
Abstract:
Vehicle classification is an important task in traffic surveillance and intelligent transportation systems. Classification of vehicle images faces several problems, such as high intra-class vehicle variation, occlusion, shadow, and illumination. These problems and others must be considered to develop a reliable vehicle classification system. In this study, a reliable multi-type vehicle classification system based on the Bag-of-Words (BoW) paradigm is developed. Our proposed system used and compared four well-known classifiers, Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), k-Nearest Neighbour (KNN), and Decision Tree, to classify vehicles into four categories: motorcycles, small, medium, and large. Experiments on a large dataset show that our approach is efficient and reliable in classifying vehicles, with an accuracy of 95.7%. The SVM outperforms the other classification algorithms in terms of both accuracy and robustness, alongside a considerable reduction in execution time. The innovative aspect of the developed system is that it can serve as a framework for many vehicle classification systems.
Keywords: vehicle classification, bag-of-words technique, SVM classifier, LDA classifier, KNN classifier, decision tree classifier, SIFT algorithm
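The BoW-plus-SVM pipeline can be sketched with synthetic stand-in features; everything below (the 50-word vocabulary, class prototypes, noise level) is an assumption for illustration, since the paper's real histograms are built from SIFT descriptors of vehicle images:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for BoW features: each "image" is a normalized
# histogram over a hypothetical visual vocabulary of 50 visual words.
rng = np.random.default_rng(0)
classes = ["motorcycle", "small", "medium", "large"]
n_per_class, n_words = 40, 50
X_parts, y = [], []
for name in classes:
    center = rng.random(n_words)  # class "prototype" histogram
    hists = np.abs(center + 0.05 * rng.standard_normal((n_per_class, n_words)))
    hists /= hists.sum(axis=1, keepdims=True)  # normalize to unit sum
    X_parts.append(hists)
    y.extend([name] * n_per_class)
X = np.vstack(X_parts)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)  # SVM, the paper's best classifier
accuracy = clf.score(X_te, y_te)
```

Swapping `SVC` for `LinearDiscriminantAnalysis`, `KNeighborsClassifier`, or `DecisionTreeClassifier` reproduces the four-way comparison the study performs.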
Procedia PDF Downloads 359
28623 Preceramic Polymers Formulations for Potential Additive Manufacturing
Authors: Saja M. Nabat Al-Ajrash, Charles Browning, Rose Eckerle, Li Cao
Abstract:
Three preceramic polymer formulations for potential use in 3D printing technologies were investigated. The polymeric precursors include an allyl hydrido polycarbosilane (SMP-10), an SMP-10/1,6-hexanediol diacrylate (HDDA) mixture, and polydimethylsiloxane (PDMS). The rheological properties of the polymeric precursors, including the viscosity within a wide shear-rate range, were compared to determine their applicability in additive manufacturing technology. The structural properties of the polymeric solutions and their photocurability were investigated using Fourier transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC). Moreover, thermogravimetric analysis (TGA) and X-ray diffraction (XRD) were utilized to study the polymer-to-ceramic conversion of the various precursors. The prepared precursor resin proved to have outstanding photocuring properties and the ability to transform to the silicon carbide phase at temperatures as low as 850 °C. The obtained ceramic was fully dense, with nearly linear shrinkage and a shiny, smooth surface after pyrolysis. Furthermore, after pyrolysis to 1350 °C and TGA analysis, the PDMS polymer showed the highest onset decomposition temperature and the lowest retained weight (52 wt%), while SMP-10/HDDA showed the lowest onset temperature and ceramic yield (71.7 wt%). In terms of crystallography, the ceramic matrix composite appeared to have three coexisting phases, including silicon carbide and silicon oxycarbide. The results are very promising for fabricating ceramic materials with complex geometries that work at high temperatures.
Keywords: preceramic polymer, silicon carbide, photocuring, allyl hydrido polycarbosilane, SMP-10
Procedia PDF Downloads 126
28622 The Stock Price Effect of Apple Keynotes
Authors: Ethan Petersen
Abstract:
In this paper, we analyze the volatility of Apple's stock from January 3, 2005 to October 9, 2014, then focus on a range from 30 days prior to each product announcement until 30 days after. Product announcements are filtered: announcements whose 60-day range is devoid of other events are separated out. This filtering is chosen to isolate, and study, a potential cross-effect. Concerning Apple keynotes, there are two significant dates: the day the invitations to the event are received and the day of the event itself. As such, the statistical analysis is conducted for both invite-centered and event-centered time frames. A comparison to the VIX is made to determine whether the trend simply follows the market or deviates from it. Regardless of the filtering, we find a clear deviation from the market. Comparing the data sets reveals significantly different trends: isolated events show a steadily decreasing, erratic trend in volatility, whereas an increasing, linear trend is observed for clustered events. According to the Efficient Market Hypothesis, we would expect a change when new information becomes publicly known, and the results of this study support this claim.
Keywords: efficient market hypothesis, event study, volatility, VIX
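The volatility metric behind such an event study can be sketched in a few lines; the price series below is hypothetical (a calm drift followed by a jump on an invented "event day"), not Apple data:

```python
import numpy as np

def rolling_volatility(prices, window=5):
    """Annualized rolling standard deviation of daily log returns."""
    prices = np.asarray(prices, dtype=float)
    rets = np.diff(np.log(prices))
    return np.array([rets[i - window + 1:i + 1].std(ddof=1) * np.sqrt(252)
                     for i in range(window - 1, len(rets))])

# Hypothetical price path: a calm drift, then a jump on day 7, loosely
# mimicking price behavior around a keynote announcement.
prices = [100.0, 100.5, 101.0, 100.8, 101.2, 101.5, 110.0, 109.0, 111.0, 108.0]
vol = rolling_volatility(prices, window=5)
# The last window, which spans the jump, shows much higher volatility
# than the first, pre-event window.
```

In the study proper, this per-stock series is compared against the VIX over the ±30-day windows to separate market-wide movement from the announcement effect.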
Procedia PDF Downloads 280
28621 Evaluation of Newly Synthesized Steroid Derivatives Using In silico Molecular Descriptors and Chemometric Techniques
Authors: Milica Ž. Karadžić, Lidija R. Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Z. Kovačević, Anamarija I. Mandić, Katarina Penov-Gaši, Andrea R. Nikolić, Aleksandar M. Oklješa
Abstract:
This study considered the selection of in silico molecular descriptors and models for describing newly synthesized steroid derivatives and for their characterization using chemometric techniques. Multiple linear regression (MLR) models were established and identified the best molecular descriptors for quantitative structure-retention relationship (QSRR) modeling of the retention of the investigated molecules. The MLR models were free of multicollinearity among the selected molecular descriptors according to the variance inflation factor (VIF) values. The molecular descriptors used were ranked using the generalized pair correlation method (GPCM). With this method, significant differences between independent variables can be detected even when their correlations with the dependent variable are almost equal. The generated MLR models were statistically validated and cross-validated, and the best models were kept. The models were ranked using the sum of ranking differences (SRD) method. With this method, the most consistent QSRR model can be found, and similarity or dissimilarity between the models can be observed. In this study, SRD was performed using the average values of the experimentally observed data as a golden standard. The chemometric analysis was conducted in order to characterize the newly synthesized steroid derivatives for further investigation regarding their potential biological activity and further synthesis. This article is based upon work from COST Action CM1105, supported by COST (European Cooperation in Science and Technology).
Keywords: generalized pair correlation method, molecular descriptors, regression analysis, steroids, sum of ranking differences
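The VIF screen used to rule out multicollinearity is straightforward to compute; a minimal sketch with an invented two-descriptor toy matrix (the actual descriptors and their values come from the study's in silico calculations):

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of descriptor matrix X.
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on all remaining columns (plus an intercept)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    factors = []
    for j in range(p):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        ss_res = float(((y - A @ beta) ** 2).sum())
        ss_tot = float(((y - y.mean()) ** 2).sum())
        r2 = 1.0 - ss_res / ss_tot
        factors.append(float("inf") if r2 >= 1.0 else 1.0 / (1.0 - r2))
    return factors

# Toy descriptor matrix with mutually uncorrelated columns: both VIFs ~ 1.
# (Values above roughly 5-10 are the usual red flag for multicollinearity.)
descriptors = np.array([[1.0, 0.0],
                        [0.0, 1.0],
                        [1.0, 1.0],
                        [0.0, 0.0]])
factors = vif(descriptors)
```

Descriptors whose VIF stays near 1, as here, can enter the same MLR model without inflating coefficient variances.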
Procedia PDF Downloads 348
28620 Residential Satisfaction and Public Perception of Socialized Housing Projects in Davao City, Philippines
Authors: Micah Amor P. Yares
Abstract:
Aside from the provision of adequate housing, the Philippine government faces the challenge of ensuring that the housing units provided conform to Filipinos' aspirations of homeownership, as manifested in owning a small house on a big lot. The study aimed to explore the levels of satisfaction of end-users and the public perception towards socialized housing in Davao City, Philippines. The residential satisfaction survey includes three types of respondents: end-users of single-detached, duplex, and rowhouse socialized housing units. Respondents were asked to rate their level of satisfaction with, and perception of, the following housing components: Dwelling Unit; Public Facilities; Social Environment; Neighborhood Facilities; Management Systems; and Acquisition and Financing. The data were subjected to Exploratory Factor Analysis to determine whether variables could be grouped together, and Confirmatory Factor Analysis to measure how well the model fits the construct. To determine which components affect the levels of perception and satisfaction, Multiple Linear Regression Analysis was employed. Lastly, an Independent Samples T-Test was performed to compare the levels of satisfaction and perception among respondents. Results revealed that residents of socialized housing were highly satisfied with their living conditions despite concerns about management systems and public and neighborhood facilities. Residents' satisfaction is primarily influenced by the Social Environment, Acquisition and Financing, and the Dwelling Unit. However, a significant difference in residential satisfaction level was observed among the different types of housing, with rowhouse residents recording the lowest satisfaction level compared to single-detached and duplex units. Moreover, the general public perceived socialized housing as moderately satisfactory, with the same determinants as the end-users aside from Public Facilities.
This study recommends revisiting current socialized housing policies by considering feedback from the end-users, based on their lived experience, and from the public, according to their perception.
Keywords: public perception, residential satisfaction, rowhouse, socialized housing
Procedia PDF Downloads 240
28619 Optimal Image Representation for Linear Canonical Transform Multiplexing
Authors: Navdeep Goel, Salvador Gabarda
Abstract:
Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted over a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4x4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Keeping fewer than 4x4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR), and peak signal-to-noise ratio (PSNR), in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. The polynomial coefficients are then encoded and used for generating chirps, at a target rate of about two chirps per 4x4 pixel block, before being submitted to a transmission multiplexing operation in the time-frequency domain.
Keywords: chirp signals, image multiplexing, image transformation, linear canonical transform, polynomial approximation
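One of the evaluated block approximations, the SVD, can be sketched directly; the rank-1 choice and the random test block below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def block_svd_approx(block, k=1):
    """Rank-k approximation of a pixel block via truncated SVD (one of the
    candidate transforms; Chebyshev and Vandermonde bases are alternatives)."""
    u, s, vt = np.linalg.svd(block, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k, :]

def psnr(original, approx, peak=255.0):
    """Peak signal-to-noise ratio between a block and its approximation."""
    mse = np.mean((np.asarray(original) - np.asarray(approx)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(4, 4)).astype(float)
approx = block_svd_approx(block, k=1)  # 4 + 1 + 4 = 9 numbers instead of 16
quality = psnr(block, approx)
```

Each retained rank costs one left vector, one singular value, and one right vector, so the rank chosen per block directly trades PSNR against the coefficient count that must be turned into chirps.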
Procedia PDF Downloads 414
28618 The Scientific Study of the Relationship Between Physicochemical and Microstructural Properties of Ultrafiltered Cheese: Protein Modification and Membrane Separation
Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh
Abstract:
The loss of curd cohesiveness and syneresis are two common problems in the ultrafiltered cheese industry. In this study, using membrane technology and protein modification, a modified cheese was developed and its properties were compared with those of a control sample. In order to decrease the lactose content and adjust the protein, acidity, dry matter, and milk minerals, a combination of ultrafiltration, nanofiltration, and reverse osmosis technologies was employed. For protein modification, a two-stage chemical and enzymatic reaction was employed before and after ultrafiltration. The physicochemical and microstructural properties of the modified ultrafiltered cheese were compared with those of the control. Results showed that the modified protein enhanced the functional properties of the final cheese significantly (p < 0.05), even though its protein content was 50% lower than that of the control. The modified cheese showed 21 ± 0.70, 18 ± 1.10, and 25 ± 1.65% higher hardness, cohesiveness, and water-holding capacity values, respectively, than the control sample. This behavior could be explained by the developed microstructure of the gel network. Furthermore, chemical-enzymatic modification of the milk protein induced a significant change in the network parameters of the final cheese. The indices of network linkage strength, network linkage density, and time scale of junctions were 10.34 ± 0.52, 68.50 ± 2.10, and 82.21 ± 3.85% higher than in the control sample, whereas the distance between adjacent linkages was 16.77 ± 1.10% lower. These results were supported by the textural analysis. A non-linear viscoelastic study showed a triangular stress waveform for the cheese containing the modified protein, while the control sample showed a rectangular stress waveform, which suggests better sliceability of the modified cheese. Moreover, to study the shelf life of the products, the acidity as well as the mold and yeast populations were determined over 120 days.
It is worth mentioning that the lactose content of the modified cheese was adjusted to 2.5% before fermentation, while that of the control was 4.5%. The control sample showed a shelf life of 8 weeks, while the shelf life of the modified cheese was 18 weeks in the refrigerator. During the 18 weeks, the acidity of the modified and control samples increased from 82 ± 1.50 to 94 ± 2.20 °D and from 88 ± 1.64 to 194 ± 5.10 °D, respectively. The mold and yeast populations over time followed the semicircular shape model (R² = 0.92, R²adj = 0.89, RMSE = 1.25). Furthermore, the mold and yeast counts and their growth rate in the modified cheese were lower than those of the control; this result could be explained by the shortage of an energy source for the microorganisms in the modified cheese. The lactose content of the modified sample was less than 0.2 ± 0.05% at the end of fermentation, while it was 3.7 ± 0.68% in the control sample.
Keywords: non-linear viscoelastic, protein modification, semicircular shape model, ultrafiltered cheese
Procedia PDF Downloads 75
28617 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach
Authors: Jean Berger, Nassirou Lo, Martin Noel
Abstract:
Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. The problem model further takes advantage of a network representation to encompass the decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangean relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization
Procedia PDF Downloads 372
28616 Design of a Portable Shielding System for a Newly Installed NaI(Tl) Detector
Authors: Mayesha Tahsin, A.S. Mollah
Abstract:
Recently, a 1.5 x 1.5 inch NaI(Tl) detector-based gamma-ray spectroscopy system has been installed in the laboratory of the Nuclear Science and Engineering Department of the Military Institute of Science and Technology for radioactivity detection purposes. The newly installed NaI(Tl) detector has a circular lead shield 22 mm thick. An important consideration in any gamma-ray spectroscopy is the minimization of natural background radiation not originating from the radioactive sample being measured. Natural background gamma-ray radiation comes from naturally occurring or man-made radionuclides in the environment or from cosmic sources. Moreover, the main problem with this system is that it is not suitable for measurements of radioactivity with a large sample container, such as a Petri dish or Marinelli beaker geometry. When a laboratory installs a new detector and/or a new shield, it must first carry out quality and performance tests for the detector and shield. This paper describes a new portable lead shielding system that can reduce the background radiation. The intensity of gamma radiation after passing through the shielding will be calculated using the shielding equation I = I0e^(-µx), where I0 is the initial intensity of the gamma source, I is the intensity after passing through the shield, µ is the linear attenuation coefficient of the shielding material, and x is the thickness of the shielding material. The height and width of the shielding will be selected in order to accommodate the large sample container. The detector will be surrounded by a 4π-geometry low-activity lead shield. An additional 1.5 mm thick shield of tin and a 1 mm thick shield of copper, covering the inner part of the lead shielding, will be added in order to remove the characteristic X-rays from the lead shield.
Keywords: shield, NaI (Tl) detector, gamma radiation, intensity, linear attenuation coefficient
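The shielding equation can be applied directly. In the sketch below the attenuation coefficient is an assumed, textbook-style value for illustration, not a measured one; only the 22 mm lead thickness comes from the description above:

```python
import math

def transmitted_intensity(i0, mu, x):
    """Shielding equation I = I0 * e^(-mu * x)."""
    return i0 * math.exp(-mu * x)

# Assumed illustrative values: mu ~ 1.2 cm^-1 is a textbook-style linear
# attenuation coefficient for ~0.662 MeV gammas in lead, and x = 2.2 cm
# matches the existing 22 mm lead shield.
i0 = 1000.0  # initial intensity, arbitrary units
i = transmitted_intensity(i0, mu=1.2, x=2.2)
fraction_transmitted = i / i0  # roughly 7% of the gammas get through
```

Because µ depends strongly on gamma energy and shield material, the same calculation is repeated for the tin and copper liners to size them against lead's characteristic X-rays.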
Procedia PDF Downloads 159
28615 Task Scheduling and Resource Allocation in Cloud-based on AHP Method
Authors: Zahra Ahmadi, Fazlollah Adibnia
Abstract:
Scheduling of tasks and the optimal allocation of resources in the cloud must account for the dynamic nature of tasks and the heterogeneity of resources. Applications based on the scientific workflow are among the most widely used applications in this field and are characterized by high processing power and storage capacity. In order to increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment, based on a modified AHP algorithm, is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resource prioritization uses the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods are used, as they are the best choice for this algorithm and have a great impact on the ranking. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (basic) methods.
Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow
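The two normalizations and the weighted ranking step can be sketched as follows; the three-VM decision matrix and the criterion weights are invented for illustration (the paper derives its priorities from AHP pairwise comparisons):

```python
import numpy as np

def linear_max_min(col, benefit=True):
    """Linear Max-Min normalization of one criterion column to [0, 1]."""
    lo, hi = col.min(), col.max()
    if hi == lo:
        return np.ones_like(col, dtype=float)
    return (col - lo) / (hi - lo) if benefit else (hi - col) / (hi - lo)

def linear_max(col, benefit=True):
    """Linear Max normalization: scale by the column maximum (or minimum)."""
    return col / col.max() if benefit else col.min() / col

# Hypothetical decision matrix for three virtual machines over the three
# prioritization criteria named in the abstract: main memory size (GB),
# processor speed (GHz), and bandwidth (Mbps). Weights are assumed.
vms = np.array([[ 8.0, 2.4,  500.0],
                [16.0, 3.0,  250.0],
                [ 4.0, 3.6, 1000.0]])
weights = np.array([0.40, 0.35, 0.25])

norm = np.column_stack([linear_max_min(vms[:, j]) for j in range(vms.shape[1])])
scores = norm @ weights           # weighted sum over normalized criteria
best_vm = int(np.argmax(scores))  # VM index with the highest priority score
```

Normalizing each column first is what makes criteria measured in incompatible units (GB, GHz, Mbps) comparable before the AHP weights are applied; the ranking can shift depending on which of the two normalizations is chosen, which is the effect the abstract highlights.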
Procedia PDF Downloads 146