Search results for: situational variables
2714 Analysis of Selected Hematological Variables during Three Different Menstrual Phases between Sedentary and Sports Women
Authors: G. Vasanthi
Abstract:
The purpose of the study was to analyse the red blood cells and white blood cells during three different menstrual phases between sedentary and sports women. To achieve this purpose, fifteen female sedentary postgraduate students (M.A., M.Sc.) and fifteen women students of Master of Physical Education and Sports (M.P.Ed.), who were regularly involved in vigorous sports training and participated in sports competitions in different games, were selected by adopting a random sampling method. All the students were hostelers, and their age group was between 20 and 22 years. Blood samples were collected during the mid-period of the three different phases to count the red blood cells and white blood cells. The data collected were treated statistically using analysis of variance. The results reveal that the RBC and WBC counts differ significantly between sedentary and sports women during the three different menstrual phases.
Keywords: RBC, WBC, menstrual, proliferative, secretory, sedentary women, sports women
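As an illustration of the statistical comparison described in this abstract, the following is a minimal one-way ANOVA sketch; the group values are invented for illustration and are not the study's data.

```python
# A minimal sketch of the group comparison described above: one-way ANOVA
# on RBC counts (10^6 cells/uL) for sedentary vs. sports women within one
# menstrual phase. All numbers are illustrative, not the study's data.
from scipy.stats import f_oneway

sedentary_rbc = [4.2, 4.5, 4.1, 4.4, 4.3]
sports_rbc = [4.8, 4.9, 4.7, 5.0, 4.6]

f_stat, p_value = f_oneway(sedentary_rbc, sports_rbc)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> significant difference
```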
Procedia PDF Downloads 506
2713 The Intention to Use E-Money Transaction: The Moderating Effect of Security in Conceptual Framework
Authors: Husnil Khatimah, Fairol Halim
Abstract:
This research examines the moderating impact of security on the intention to use e-money, adapting variables from the TAM (Technology Acceptance Model) and TPB (Theory of Planned Behavior). This study uses security as a moderating variable and finds that these relationships depend on customer intention to use e-money as a payment tool. The conceptual framework of e-money transactions was reviewed to understand the behavioral intention of consumers in terms of perceived usefulness, perceived ease of use, perceived behavioral control, and security. A quantitative method will be utilized for data collection. A total of one thousand respondents will be selected using the quota sampling method in Medan, Indonesia. Descriptive analysis and multiple regression analysis will be conducted to analyze the data. The article ends with suggestions for future studies.
Keywords: e-money transaction, TAM & TPB, moderating variable, behavioral intention, conceptual paper
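The moderation test sketched in this abstract amounts to a regression with an interaction term; the following sketch uses synthetic data and assumed variable names, not the study's survey data.

```python
# A hedged sketch of the moderation test implied above: regress intention on
# the TAM/TPB predictors plus a security x usefulness interaction term.
# Variable names and the simulated data are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "usefulness": rng.normal(size=n),
    "ease_of_use": rng.normal(size=n),
    "control": rng.normal(size=n),
    "security": rng.normal(size=n),
})
# Simulated intention with a security-moderated usefulness effect
df["intention"] = (0.4 * df.usefulness + 0.2 * df.ease_of_use
                   + 0.2 * df.control + 0.3 * df.security
                   + 0.25 * df.usefulness * df.security
                   + rng.normal(scale=0.5, size=n))

# Moderation is supported if the interaction coefficient is significant
model = smf.ols("intention ~ usefulness + ease_of_use + control"
                " + security + usefulness:security", data=df).fit()
print(model.summary().tables[1])
```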
Procedia PDF Downloads 456
2712 Understanding the Association between Altruism, Personality, and Birth Order among Indian Young Adults
Authors: Shruti Soudi, Anushka Nayak
Abstract:
Altruism is a voluntary helping behavior that is not motivated by rewards. The empathy-altruism hypothesis states that altruistic behavior results from empathy, a congruent emotional response between the helper and the individual in need. Personality refers to individual differences in characteristic ways of thinking, feeling, and acting. The personality of an individual determines their behavior. More importantly, Adler was among the first psychologists to document the importance of birth order for personality. The present study aims to understand the influence of personality and birth order on altruism. A questionnaire consisting of standardized tools to measure altruism (Hindi Self Report Altruism Scale) and personality (Big Five Personality Inventory) will aid in studying the relationship between these variables among young adults in India. A statistical analysis of the data will be completed using ANOVA and t-tests in the SPSS software.
Keywords: altruism, personality, birth order, ANOVA, young adults
Procedia PDF Downloads 80
2711 Analytical Hierarchical Process for Multi-Criteria Decision-Making
Authors: Luis Javier Serrano Tamayo
Abstract:
This research on technology makes a first approach to the selection of an amphibious landing ship with strategic capabilities through the implementation of a multi-criteria model using the Analytical Hierarchical Process (AHP), in which a significant group of latest-technology alternatives has been considered. The variables were grouped at different levels to match design and performance characteristics, which affect the lifecycle as well as the acquisition, maintenance, and operational costs. The model yielded an overall measure of effectiveness and an overall measure of cost for each kind of ship, which were compared with each other inside the model and shown in a Pareto chart. The modeling was developed using the Expert Choice software, based on the AHP method.
Keywords: analytic hierarchy process, multi-criteria decision-making, Pareto analysis, Colombian Marine Corps, projection operations, expert choice, amphibious landing ship
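A minimal sketch of the core AHP computation referred to above: deriving priority weights from a pairwise comparison matrix via its principal eigenvector. The 3x3 judgment matrix is invented for illustration, not taken from the paper.

```python
# AHP priority derivation: reciprocal pairwise judgments (e.g., cost vs.
# design vs. performance), principal eigenvector, and a consistency check.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # illustrative reciprocal judgments
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()            # normalized priority vector

# Consistency ratio: CR < 0.10 is the usual acceptance threshold
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58                      # 0.58 = random index for n = 3
print(weights.round(3), round(cr, 3))
```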
Procedia PDF Downloads 552
2710 Work Life Balance Strategies and Retention of Medical Professionals
Authors: Naseem M. Twaissi
Abstract:
Medical professionals play an important role in society, and in general, they care more about their patients than about their personal well-being. They need to take a professional approach to maintain a work-life balance. Through a collection of primary data from 1020 medical professionals and the application of relevant statistical tools, this paper explores the pressures on medical professionals with reference to their work-life balance. This study highlights how hospital management, in addition to economic reasons, needs to identify variables to enhance the work-life balance of medical professionals so that quality healthcare facilities may be provided to the citizens of Jordan. Results indicate that formulation and implementation of policies for enhancing work-life balance together with career and retention plans for medical professionals would enhance the performance of hospitals and the quality of health care in Jordan, leading to greater societal well-being.
Keywords: work life balance, job environment, job satisfaction, employee well-being, stress, hospital industry
Procedia PDF Downloads 143
2709 The Effect of Feature Selection on Pattern Classification
Authors: Chih-Fong Tsai, Ya-Han Hu
Abstract:
The aim of feature selection (or dimensionality reduction) is to filter out unrepresentative features (or variables) making the classifier perform better than the one without feature selection. Since there are many well-known feature selection algorithms, and different classifiers based on different selection results may perform differently, very few studies consider examining the effect of performing different feature selection algorithms on the classification performances by different classifiers over different types of datasets. In this paper, two widely used algorithms, which are the genetic algorithm (GA) and information gain (IG), are used to perform feature selection. On the other hand, three well-known classifiers are constructed, which are the CART decision tree (DT), multi-layer perceptron (MLP) neural network, and support vector machine (SVM). Based on 14 different types of datasets, the experimental results show that in most cases IG is a better feature selection algorithm than GA. In addition, the combinations of IG with DT and IG with SVM perform best and second best for small and large scale datasets.
Keywords: data mining, feature selection, pattern classification, dimensionality reduction
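A hedged sketch of the kind of pipeline described above, using scikit-learn's mutual information (an information-gain analogue) and the three classifier families named; the dataset and the number of kept features are placeholders.

```python
# Rank features by mutual information (information-gain analogue), keep the
# top k, then cross-validate the three classifier families from the paper.
# Dataset and k are stand-ins, not the paper's 14 benchmark datasets.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier   # sklearn trees are CART
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_sel = SelectKBest(mutual_info_classif, k=10).fit_transform(X, y)

for name, clf in [("DT", DecisionTreeClassifier()),
                  ("MLP", MLPClassifier(max_iter=1000)),
                  ("SVM", SVC())]:
    score = cross_val_score(clf, X_sel, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```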
Procedia PDF Downloads 671
2708 Multi-Objective Optimization (Pareto Sets) and Multi-Response Optimization (Desirability Function) of Microencapsulation of Emamectin
Authors: Victoria Molina, Wendy Franco, Sergio Benavides, José M. Troncoso, Ricardo Luna, José R. Pérez-Correa
Abstract:
Emamectin Benzoate (EB) is a crystalline antiparasitic that belongs to the avermectin family. It is one of the most common treatments used in Chile to control Caligus rogercresseyi in Atlantic salmon. However, sea lice acquire resistance to EB when exposed to sublethal EB doses. The low solubility rate of EB and its degradation at the acidic pH of the fish digestive tract are the causes of the slow absorption of EB in the intestine. To protect EB from degradation and enhance its absorption, specific microencapsulation technologies must be developed. Amorphous solid dispersion techniques such as Spray Drying (SD) and Ionic Gelation (IG) seem adequate for this purpose. Recently, Soluplus® (SOL) has been used to increase the solubility rate of several drugs with characteristics similar to those of EB. In addition, alginate (ALG) is a widely used polymer in IG for biomedical applications. Regardless of the encapsulation technique, the quality of the obtained microparticles is evaluated with the following responses: yield (Y%), encapsulation efficiency (EE%), and loading capacity (LC%). In addition, it is important to know the percentage of EB released from the microparticles in gastric (GD%) and intestinal (ID%) digestions. In this work, we microencapsulated EB with SOL (EB-SD) and with ALG (EB-IG) using SD and IG, respectively. Quality microencapsulation responses and in vitro gastric and intestinal digestions at pH 3.35 and 7.8, respectively, were obtained. A central composite design was used to find the optimum microencapsulation variables (amount of EB, amount of polymer, and feed flow). In each formulation, the behavior of these variables was predicted with statistical models. Then, response surface methodology was used to find the best combination of the factors that allowed a lower EB release under gastric conditions while permitting a greater release during intestinal digestion. Two approaches were used to determine this: the desirability approach (DA) and multi-objective optimization (MOO) with multi-criteria decision making (MCDM). Both microencapsulation techniques made it possible to maintain the integrity of EB at acid pH, given the small amount of EB released in the gastric medium, while EB-IG microparticles showed greater EB release during intestinal digestion. For EB-SD, the optimal conditions obtained with MOO plus MCDM yielded a good compromise among the microencapsulation responses. In addition, using these conditions, it is possible to reduce microparticle costs owing to a 60% reduction in EB relative to the optimal EB amount proposed by DA. For EB-IG, the optimization techniques used (DA and MOO) yielded solutions with different advantages and limitations. Applying DA, costs can be reduced by 21%, while Y, GD, and ID showed 9.5%, 84.8%, and 2.6% lower values than the best condition. In turn, MOO yielded better microencapsulation responses, but at a higher cost. Overall, EB-SD with the operating conditions selected by MOO seems the best option, since a good compromise between costs and encapsulation responses was obtained.
Keywords: microencapsulation, multiple decision-making criteria, multi-objective optimization, Soluplus®
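The desirability approach (DA) named above can be sketched as follows; the response models and desirability bounds are assumed for illustration, and none of the numbers come from the paper.

```python
# Derringer-style desirability: map each response to [0, 1], maximize the
# geometric mean. Target directions follow the abstract (minimize gastric
# release GD%, maximize intestinal release ID%); the quadratic response
# surfaces are illustrative stand-ins for the fitted statistical models.
import numpy as np
from scipy.optimize import minimize

def d_max(y, lo, hi):                     # larger-is-better desirability
    return np.clip((y - lo) / (hi - lo), 0, 1)

def d_min(y, lo, hi):                     # smaller-is-better desirability
    return np.clip((hi - y) / (hi - lo), 0, 1)

def responses(x):
    eb, polymer = x                       # coded factor levels in [-1, 1]
    gd = 20 - 5 * polymer + 3 * eb        # % released in gastric medium
    id_ = 60 + 10 * polymer - 2 * eb**2   # % released in intestine
    return gd, id_

def neg_overall_desirability(x):
    gd, id_ = responses(x)
    D = np.sqrt(d_min(gd, 5, 30) * d_max(id_, 40, 80))  # geometric mean
    return -D

res = minimize(neg_overall_desirability, x0=[0.0, 0.0],
               bounds=[(-1, 1), (-1, 1)])
print(res.x, -res.fun)
```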
Procedia PDF Downloads 132
2707 The Classical Conditioning Effect of Animated Spokes-Characters
Authors: Chia-Ching Tsai, Ting-Hsiu Chen
Abstract:
This paper adopted a 2×2 factorial design. One factor was the experimental versus control condition. The other factor was the type of animated spokes-character: one of the two levels was the expert type, and the other was the attractive type. In the study, we use control versus experimental conditioning and the type of animated spokes-character as independent variables, and brand attitude as the dependent variable, to examine the conditioning effect of the type of animated spokes-character on brand attitude. A total of 123 subjects participated in the experiment. The results showed that the conditioning group exhibited a significantly superior product-endorsement effect of the animated spokes-character in contrast to the non-conditioning one, while there was no significant impact of the type of animated spokes-character on brand attitude.
Keywords: classical conditioning, animated spokes-character, brand attitude, factorial design
Procedia PDF Downloads 275
2706 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates
Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim
Abstract:
The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via a simulation. Monte Carlo simulation in the R programming language was employed to generate data based on the theoretical model with one endogenous and four exogenous variables. Each latent variable has three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could produce the path estimates. Meanwhile, for a larger sample size, CB-SEM estimates have lower variability compared to PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes. However, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM
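A minimal sketch of the data-generating step of such a Monte Carlo study; the loadings, sample sizes, and choice of skewed distribution are illustrative assumptions, not the study's design values.

```python
# Generate indicator data for one latent variable with three indicators,
# under normal and skewed (non-normal) latent-score conditions.
import numpy as np

rng = np.random.default_rng(42)

def simulate_indicators(n, loadings=(0.8, 0.7, 0.6), normal=True):
    # Latent scores: standard normal, or standardized chi-square (skewed)
    if normal:
        eta = rng.normal(size=n)
    else:
        eta = (rng.chisquare(df=3, size=n) - 3) / np.sqrt(6)
    lam = np.array(loadings)
    errors = rng.normal(scale=np.sqrt(1 - lam**2), size=(n, 3))
    return eta[:, None] * lam + errors   # n x 3 indicator matrix

for n in (50, 100, 500):                 # replicate across sample sizes
    X = simulate_indicators(n, normal=False)
    print(n, np.corrcoef(X, rowvar=False).round(2))
```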
Procedia PDF Downloads 414
2705 Financial Assets Return, Economic Factors and Investor's Behavioral Indicators Relationships Modeling: A Bayesian Networks Approach
Authors: Nada Souissi, Mourad Mroua
Abstract:
The main purpose of this study is to examine the interaction between financial asset volatility, economic factors, and investor behavioral indicators related to both company and market stocks for the period from January 2000 to January 2020. Using multiple linear regression and Bayesian network modeling, the results show both positive and negative relationships between the investor psychology index, economic factors, and the predicted stock market return. We reveal that the application of the discrete Bayesian network contributes to identifying the different cause-and-effect relationships between all economic and financial variables and the psychology index.
Keywords: financial asset return predictability, economic factors, investor's psychology index, Bayesian approach, probabilistic networks, parametric learning
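A hedged sketch of a discrete Bayesian network of the kind described, using the pgmpy library (class names follow recent pgmpy releases and are an assumption); the structure, variables, and data are invented for illustration.

```python
# Learn CPDs linking discretized economic factors and a sentiment index to
# next-period return, then query the network. Toy data, not the study's.
import numpy as np
import pandas as pd
from pgmpy.models import BayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator
from pgmpy.inference import VariableElimination

rng = np.random.default_rng(1)
n = 500
data = pd.DataFrame({
    "rates": rng.integers(0, 2, n),       # 0 = low, 1 = high
    "sentiment": rng.integers(0, 2, n),   # 0 = fearful, 1 = confident
})
# Return depends (noisily) on both parents in this toy generator
data["return"] = ((data.sentiment - data.rates
                   + rng.integers(0, 2, n)) > 0).astype(int)

model = BayesianNetwork([("rates", "return"), ("sentiment", "return")])
model.fit(data, estimator=MaximumLikelihoodEstimator)

infer = VariableElimination(model)
print(infer.query(["return"], evidence={"sentiment": 1, "rates": 0}))
```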
Procedia PDF Downloads 152
2704 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models
Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini
Abstract:
The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely the least-squares (LS) estimator, which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim of this study is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion
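The abstract does not reproduce the criterion itself; a standard form of SIC, with the robust modification the abstract suggests (an MM-estimation scale replacing the LS scale), might be sketched as follows. The notation is assumed, not quoted from the paper.

```latex
% Classical SIC for a linear model with p candidate predictors,
% n observations, and least-squares residual scale \hat{\sigma}_{LS}
\mathrm{SIC}(p) = n \,\log \hat{\sigma}_{LS}^{2} + p \,\log n
% Robust variant along the lines of the abstract: the MM-estimation
% scale \hat{\sigma}_{MM} bounds the influence of vertical outliers
% and bad leverage points
\mathrm{SIC}_{R}(p) = n \,\log \hat{\sigma}_{MM}^{2} + p \,\log n
```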
Procedia PDF Downloads 142
2703 Comparison of Two Transcranial Magnetic Stimulation Protocols on Spasticity in Multiple Sclerosis - Pilot Study of a Randomized and Blind Cross-over Clinical Trial
Authors: Amanda Cristina da Silva Reis, Bruno Paulino Venâncio, Cristina Theada Ferreira, Andrea Fialho do Prado, Lucimara Guedes dos Santos, Aline de Souza Gravatá, Larissa Lima Gonçalves, Isabella Aparecida Ferreira Moretto, João Carlos Ferrari Corrêa, Fernanda Ishida Corrêa
Abstract:
Objective: To compare two protocols of Transcranial Magnetic Stimulation (TMS) on quadriceps muscle spasticity in individuals diagnosed with Multiple Sclerosis (MS). Method: A clinical crossover study in which six adult individuals diagnosed with MS and spasticity in the lower limbs were randomized to receive one session each of high-frequency (≥5 Hz) and low-frequency (≤1 Hz) TMS over the motor cortex (M1) hotspot for the quadriceps muscle, with a one-week interval between the sessions. Spasticity was assessed with the Ashworth scale, and the latency time (ms) of the motor evoked potential (MEP) and the central motor conduction time (CMCT) of the bilateral quadriceps muscle were analyzed. Assessments were performed before and after each intervention. The difference between groups was analyzed using the Friedman test, with a significance level of 0.05 adopted. Results: All statistical analyses were performed using SPSS Statistics version 26, with the significance level established at p<0.05. Normality was tested with the Shapiro-Wilk test. Parametric data were represented as mean and standard deviation; non-parametric variables as median and interquartile range; and categorical variables as frequency and percentage. There was no clinical change in quadriceps spasticity assessed using the Ashworth scale for the 1 Hz (p=0.813) and 5 Hz (p=0.232) protocols for either limb. Motor evoked potential latency time: in the 5 Hz protocol, there was no significant change for the contralateral side from pre- to post-treatment (p>0.05), while for the ipsilateral side there was a decrease in latency time of 0.07 seconds (p<0.05); in the 1 Hz protocol, there was an increase of 0.04 seconds in latency time (p<0.05) for the side contralateral to the stimulus, and for the ipsilateral side there was a decrease in latency time of 0.04 seconds (p<0.05), with a significant difference between the contralateral (p=0.007) and ipsilateral (p=0.014) groups. Central motor conduction time: in the 1 Hz protocol, there was no change for the contralateral side (p>0.05) or for the ipsilateral side (p>0.05). In the 5 Hz protocol, there was a small decrease in latency time for the contralateral side (p<0.05), and for the ipsilateral side there was a decrease of 0.6 seconds in latency time (p<0.05), with a significant difference between groups (p=0.019). Conclusion: A single high- or low-frequency session does not change spasticity, but it was observed that when the low-frequency protocol was performed, latency time increased on the stimulated side and decreased on the non-stimulated side, suggesting that inhibiting the motor cortex increases cortical excitability on the opposite side.
Keywords: multiple sclerosis, spasticity, motor evoked potential, transcranial magnetic stimulation
Procedia PDF Downloads 92
2702 A Formal Microlectic Framework for Biological Circularchy
Authors: Ellis D. Cooper
Abstract:
“Circularchy” is supposed to be an adjustable formal framework with enough expressive power to articulate biological theory about Earthly Life in the sense of multi-scale biological autonomy constrained by non-equilibrium thermodynamics. “Formal framework” means specifically a multi-sorted first-order theory with equality (for each sort). Philosophically, such a theory is one kind of “microlect,” which means a “way of speaking” (or, more generally, a “way of behaving”) for overtly expressing a “mental model” of some “referent.” Other kinds of microlect include “natural microlect,” “diagrammatic microlect,” and “behavioral microlect,” with examples such as “political theory,” “Euclidean geometry,” and “dance choreography,” respectively. These are all describable in terms of a vocabulary conforming to grammar. As aspects of human culture, they are possibly reminiscent of Ernst Cassirer’s idea of “symbolic form;” as vocabularies, they are akin to Richard Rorty’s idea of “final vocabulary” for expressing a mental model of one’s life. A formal microlect is presented by stipulating sorts, variables, calculations, predicates, and postulates. Calculations (a.k.a. “terms”) may be composed to form more complicated calculations; predicates (a.k.a. “relations”) may be logically combined to form more complicated predicates; and statements (a.k.a. “sentences”) are grammatically correct expressions which are true or false. Conclusions are statements derived using logical rules of deduction from postulates, other assumed statements, or previously derived conclusions. A circularchy is a formal microlect constituted by two or more sub-microlects, each with its distinct stipulations of sorts, variables, calculations, predicates, and postulates. Within a sub-microlect, some postulates or conclusions are equations, which are statements that declare equality of specified calculations. An equational bond between an equation in one sub-microlect and an equation in either the same sub-microlect or in another sub-microlect is a predicate that declares equality of symbols occurring in a side of one equation with symbols occurring in a side of the other equation. Briefly, a circularchy is a network of equational bonds between sub-microlects. A circularchy is solvable if there exist solutions for all equations that satisfy all equational bonds. If a circularchy is not solvable, then a challenge would be to discover the obstruction to solvability and then conjecture what adjustments might remove the obstruction. Adjustment means changes in stipulated ingredients (sorts, etc.) of sub-microlects, or changes in equational bonds between sub-microlects, or introduction of new sub-microlects and new equational bonds. A circularchy is modular insofar as each sub-microlect is a node in a network of equational bonds. Solvability of a circularchy may be conjectured. Efforts to prove solvability may be thwarted by a counter-example or may lead to the construction of a solution. An automated theorem-proof assistant would likely be necessary for investigating a substantial circularchy, such as one purported to represent Earthly Life. Such investigations (chains of statements) would be concurrent with and no substitute for simulations (chains of numbers).
Keywords: autonomy, first-order theory, mathematics, thermodynamics
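As a toy rendering of the solvability notion defined above, the following sketch joins two one-equation sub-microlects by an equational bond and asks for a simultaneous solution; the symbols and equations are invented, not taken from the paper.

```python
# Solvability of a tiny circularchy: sub-microlect A contributes one
# equation, sub-microlect B another, and a bond identifies A's y with
# B's u. The circularchy is solvable iff a simultaneous solution exists.
import sympy as sp

x, y, u, v = sp.symbols("x y u v")

eq_A = sp.Eq(x + y, 10)        # equation in sub-microlect A
eq_B = sp.Eq(u - v, 2)         # equation in sub-microlect B
bond = sp.Eq(y, u)             # equational bond between A and B

solution = sp.solve([eq_A, eq_B, bond], [x, y, u, v], dict=True)
print(solution or "obstruction: no simultaneous solution")
```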
Procedia PDF Downloads 221
2701 Potential Impacts of Climate Change on Hydrological Droughts in the Limpopo River Basin
Authors: Nokwethaba Makhanya, Babatunde J. Abiodun, Piotr Wolski
Abstract:
Climate change possibly intensifies hydrological droughts and reduces water availability in river basins. Despite this, most research on climate change effects in southern Africa has focused exclusively on meteorological droughts. This thesis projects the potential impact of climate change on the future characteristics of hydrological droughts in the Limpopo River Basin (LRB). The study uses regional climate model (RCM) simulations (from the Coordinated Regional Climate Downscaling Experiment, CORDEX) and a combination of hydrological simulations (using the Soil and Water Assessment Tool Plus model, SWAT+) to predict the impacts at four global warming levels (GWLs: 1.5℃, 2.0℃, 2.5℃, and 3.0℃) under the RCP8.5 future climate scenario. The SWAT+ model was calibrated and validated with a streamflow dataset observed over the basin, and the sensitivity of model parameters was investigated. The performance of the SWAT+ LRB model was verified using the Nash-Sutcliffe efficiency (NSE), Percent Bias (PBIAS), Root Mean Square Error (RMSE), and coefficient of determination (R²). The Standardized Precipitation Evapotranspiration Index (SPEI) and the Standardized Precipitation Index (SPI) were used to detect meteorological droughts. The Standardized Soil Water Index (SSI) was used to define agricultural drought, while the Water Yield Drought Index (WYLDI), the Surface Run-off Index (SRI), and the Streamflow Index (SFI) were used to characterise hydrological drought. The performance of the SWAT+ model simulations over the LRB is sensitive to the parameters CN2 (initial SCS runoff curve number for moisture condition II) and ESCO (soil evaporation compensation factor). The best simulation generally performed better during the calibration period than the validation period. In the calibration and validation periods, NSE is ≤ 0.8, while PBIAS is ≥ −80.3%, RMSE ≥ 11.2 m³/s, and R² ≤ 0.9. The simulations project a future increase in temperature and potential evapotranspiration over the basin, but they do not project a significant future trend in precipitation and hydrological variables. However, the spatial distribution of precipitation reveals a projected increase in precipitation in the southern part of the basin and a decline in the northern part, with the region of reduced precipitation projected to expand with GWLs. A decrease in all hydrological variables is projected over most parts of the basin, especially the eastern part. The simulations predict that meteorological droughts (i.e., SPEI and SPI), agricultural droughts (i.e., SSI), and hydrological droughts (i.e., WYLDI and SRI) would become more intense and severe across the basin. SPEI-drought has a greater magnitude of increase than SPI-drought, and agricultural and hydrological droughts have a magnitude of increase between the two. As a result, this research suggests that future hydrological droughts over the LRB could be more severe than the SPI-drought projection predicts but less severe than the SPEI-drought projection. This research can be used to mitigate the effects of potential climate change on basin hydrological drought.
Keywords: climate change, CORDEX, drought, hydrological modelling, Limpopo River Basin
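For reference, minimal implementations of the four calibration metrics cited above; the example arrays are placeholders, not LRB data.

```python
# NSE, PBIAS, RMSE, and R^2 for comparing simulated vs. observed streamflow.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100 * np.sum(obs - sim) / np.sum(obs)

def rmse(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return np.sqrt(np.mean((obs - sim) ** 2))

def r2(obs, sim):
    return np.corrcoef(obs, sim)[0, 1] ** 2

observed = [12.0, 30.5, 8.2, 45.1, 20.3]   # streamflow, m^3/s (placeholder)
simulated = [10.8, 28.0, 9.5, 40.2, 22.1]
print(nse(observed, simulated), pbias(observed, simulated),
      rmse(observed, simulated), r2(observed, simulated))
```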
Procedia PDF Downloads 130
2700 Exergy: An Effective Tool to Quantify Sustainable Development of Biodiesel Production
Authors: Mahmoud Karimi, Golmohammad Khoobbakht
Abstract:
This study focuses on the exergy flow analysis in the transesterification of waste cooking oil with methanol to decrease the consumption of materials and energy and promote the use of renewable resources. The exergy analysis performed is based on the thermodynamic performance parameters, namely exergy destruction and exergy efficiency, to investigate the effects of variable parameters on the renewability of transesterification. The experimental variables were the methanol to WCO ratio, catalyst concentration, and reaction temperature in the transesterification reaction. The optimum condition, with a yield of 90.2% and exergy efficiency of 95.2%, was obtained at a methanol to oil molar ratio of 8:1 and 1 wt.% of KOH, at 55 °C. In this condition, the total waste exergy was found to be 45.4 MJ for 1 kg of biodiesel production. The high yield at the optimal condition thus resulted in high exergy efficiency in the transesterification of WCO with methanol.
Keywords: biodiesel, exergy, thermodynamic analysis, transesterification, waste cooking oil
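For reference, the two performance parameters named in this abstract have standard definitional forms; the following is a sketch with assumed notation, not formulas quoted from the paper.

```latex
% Exergy destruction: the gap between exergy entering and leaving
% the transesterification step
\dot{E}x_{dest} = \sum_{in} \dot{E}x - \sum_{out} \dot{E}x
% Exergy efficiency: the fraction of supplied exergy recovered
% in the product streams
\eta_{ex} = \frac{\sum_{out} \dot{E}x}{\sum_{in} \dot{E}x} \times 100\%
```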
Procedia PDF Downloads 195
2699 Naphtha Catalytic Reforming: Modeling and Simulation of the Unit
Authors: Leal Leonardo, Pires Carlos Augusto de Moraes, Casiraghi Magela
Abstract:
This work presents the modeling and simulation of the catalytic reforming process in a comprehensive way, considering all the equipment that influences operating performance. A semi-regenerative reformer was considered, with four reactors in series intercalated with four furnaces, two heat exchangers, one product separator, and one recycle compressor. A simplified reaction system was considered, involving only ten chemical compounds related through five reactions. The process considered was applied to aromatics production (benzene, toluene, and xylene). The models developed for the various pieces of equipment were interconnected in a simulator consisting of a computer program written in FORTRAN 77. The simulation of the global model representing the reformer unit achieved results compatible with those in the literature. It was then possible to study the effects of operational variables on product concentrations and on the performance of the unit's equipment.
Keywords: catalytic reforming, modeling, simulation, petrochemical engineering
Procedia PDF Downloads 518
2698 Empirical and Indian Automotive Equity Portfolio Decision Support
Authors: P. Sankar, P. James Daniel Paul, Siddhant Sahu
Abstract:
A brief review of the empirical studies on stock market decision support methodology indicates that they are at the threshold of validating the accuracy of traditional models as well as fuzzy, artificial neural network, and decision tree models. Many researchers have been attempting to compare these models using various data sets worldwide. However, the research community has yet to reach conclusive confidence in the emerging models. This paper uses automotive sector stock prices from the National Stock Exchange (NSE), India, and analyzes them for intra-sectorial support for stock market decisions. The study identifies the significant variables, and their lags, which affect the price of the stocks, using OLS analysis and decision tree classifiers.
Keywords: Indian automotive sector, stock market decisions, equity portfolio analysis, decision tree classifiers, statistical data analysis
Procedia PDF Downloads 486
2697 Neural Network Based Control Algorithm for Inhabitable Spaces Applying Emotional Domotics
Authors: Sergio A. Navarro Tuch, Martin Rogelio Bustamante Bello, Leopoldo Julian Lechuga Lopez
Abstract:
In recent years, Mexico’s population has seen a rise in different negative physiological and mental states. Two main consequences of this problem are deficient work performance and high levels of stress, which generate an important impact on a person’s physical, mental, and emotional health. Several approaches, such as the use of audiovisual stimuli to induce emotions and modify a person’s emotional state, can be applied in an effort to decrease these negative effects. With the use of different non-invasive physiological sensors, such as EEG, luminosity, and face recognition, we gather information on the subject’s current emotional state. In a controlled environment, a subject is shown a series of selected images from the International Affective Picture System (IAPS) in order to induce a specific set of emotions and obtain information from the sensors. The raw data obtained are statistically analyzed in order to filter only the specific groups of information that relate to the subject’s emotions and the current values of the physical variables in the controlled environment, such as luminosity, RGB light color, temperature, oxygen level, and noise. Finally, a neural-network-based control algorithm is given the data obtained in order to feed back into the system and automate the modification of the environment variables and the audiovisual content shown, in an effort to positively alter the subject’s emotional state. During the research, it was found that the light color was directly related to the type of impact generated by the audiovisual content on the subject’s emotional state. Red illumination increased the impact of violent images, and green illumination along with relaxing images decreased the subject’s levels of anxiety. Specific differences between men and women were found as to which type of images generated a greater impact on either gender. The population sample was mainly constituted by college students, whose data analysis showed a decreased sensibility to violence towards humans. Despite the early stage of the control algorithm, the results obtained from the population sample give us a better insight into the possibilities of emotional domotics and the applications that can be created towards the improvement of performance in people’s lives. The objective of this research is to create a positive impact with the application of technology to everyday activities; nonetheless, an ethical problem arises, since this can also be applied to control a person’s emotions and shift their decision making.
Keywords: data analysis, emotional domotics, performance improvement, neural network
Procedia PDF Downloads 143
2696 The Fiscal-Monetary Policy and Economic Growth in Algeria: VECM Approach
Authors: K. Bokreta, D. Benanaya
Abstract:
The objective of this study is to examine the relative effectiveness of monetary and fiscal policy in Algeria using the econometric modelling techniques of cointegration and vector error correction modelling to analyse and draw policy inferences. The chosen variables of fiscal policy are government expenditure and net taxes on products, while the effect of monetary policy is represented by the inflation rate and the official exchange rate. From the results, we find that in the long run, the impact of government expenditure is positive, while the effect of taxes on growth is negative. Additionally, we find that the inflation rate has little effect on GDP per capita, and the impact of the exchange rate is insignificant. We conclude that fiscal policy is more powerful than monetary policy in promoting economic growth in Algeria.
Keywords: economic growth, monetary policy, fiscal policy, VECM
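A hedged sketch of the VECM estimation underlying such a study, using statsmodels; synthetic random-walk series stand in for the Algerian data, and the lag order and deterministic terms are assumptions.

```python
# Cointegration rank selection (Johansen) followed by VECM estimation on
# a five-variable system mirroring the abstract's variable set.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(7)
n = 120  # e.g., quarterly observations
data = pd.DataFrame(
    rng.normal(size=(n, 5)).cumsum(axis=0),   # I(1)-like placeholder series
    columns=["gdp_pc", "gov_exp", "net_taxes", "inflation", "exch_rate"],
)

rank = select_coint_rank(data, det_order=0, k_ar_diff=2)  # Johansen test
model = VECM(data, k_ar_diff=2, coint_rank=max(rank.rank, 1),
             deterministic="ci")
res = model.fit()
print(res.summary())
```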
Procedia PDF Downloads 312
2695 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees
Authors: Alexandru-Ion Marinescu
Abstract:
There exists a plethora of methods in the scientific literature which tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, loan duration, etc., and must output a binary response variable (i.e. “GOOD” or “BAD”) stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence among financial institution databases, with the majority being classified as “GOOD” clients (clients that respect the loan return calendar) alongside a small percentage of “BAD” clients. But it is the “BAD” clients we are interested in, since accurately predicting their behavior is crucial in preventing unwanted loss for loan providers. We add to this whole context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism – LINQ expression trees, readily available in the C# programming language, enabling us to construct executable pieces of code at runtime. As the title implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal. A consequence of our chosen technique is that we can identify and discard client properties which do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare ourselves with state-of-the-art approaches, such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. The data sets we benchmark against amount to a total of 8, of which we mention the well-known Australian credit and German credit data sets, and the performance indicators are the following: percentage correctly classified, area under curve, partial Gini index, H-measure, Brier score, and Kolmogorov-Smirnov statistic, respectively. Finally, we obtain encouraging results, which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution
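The paper constructs LINQ expression trees in C#; as a language-neutral stand-in, the following Python sketch shows the same idea of evolving operator trees over applicant variables with subtree mutation. The operator set, depth limits, and field names are illustrative simplifications, not the paper's configuration.

```python
# Expression-tree genetic programming in miniature: operator nodes over
# variables/constants, random tree construction, and subtree mutation.
import random, operator

OPS = [(operator.add, "+"), (operator.sub, "-"), (operator.mul, "*")]
VARS = ["age", "loan_duration", "income"]   # stand-ins for applicant fields

def random_tree(depth=3):
    if depth <= 0 or random.random() < 0.3:
        return random.choice(VARS + [round(random.uniform(-2, 2), 2)])
    op = random.choice(OPS)
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(node, row):
    if isinstance(node, tuple):
        (fn, _), left, right = node
        return fn(evaluate(left, row), evaluate(right, row))
    return row[node] if isinstance(node, str) else node

def mutate(node, depth=3):
    if random.random() < 0.2:
        return random_tree(depth)           # replace a whole subtree
    if isinstance(node, tuple):
        op, left, right = node
        return (op, mutate(left, depth - 1), mutate(right, depth - 1))
    return node

row = {"age": 35, "loan_duration": 24, "income": 1.8}
tree = random_tree()
# A score threshold would turn this into the GOOD/BAD classification
print(evaluate(tree, row), evaluate(mutate(tree), row))
```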
Procedia PDF Downloads 120
2694 The Factors to Determine the Content About Gender and Sexuality Education Among Adolescents in China
Authors: Yixiao Tang
Abstract:
The risks of adolescents being exposed to sexually transmitted diseases (STDs) and participating in unsafe sexual practices are increasing. Providing adolescents with appropriate sex education is therefore necessary and significant, considering they are at a stage of life exploration and risk-taking. However, in delivering sex education, the contents and instruction methods are usually debated, with contextual differences. In the Chinese context, socially prejudiced perceptions of homosexuality can be attributed to traditional Chinese Confucian philosophy, which has dominated Chinese education for thousands of years. In China, students rarely receive adequate information about HIV, STDs, the use of contraceptives, pregnancies, and other sexually related topics in their formal education. Against this Confucian cultural background, this essay analyzes the variables that determine the subject matter of sex education for adolescents and then discusses how this cultural form affects social views and policy on sex education.
Keywords: homosexuality education, adolescent, China, education policy
Procedia PDF Downloads 79
2693 A Strategy for the Application of Second-Order Monte Carlo Algorithms to Petroleum Exploration and Production Projects
Authors: Obioma Uche
Abstract:
Due to the recent volatility in oil & gas prices as well as increased development of non-conventional resources, it has become even more essential to critically evaluate the profitability of petroleum prospects prior to making any investment decisions. Traditionally, simple Monte Carlo (MC) algorithms have been used to randomly sample probability distributions of economic and geological factors (e.g. price, OPEX, CAPEX, reserves, productive life, etc.) in order to obtain probability distributions for profitability metrics such as Net Present Value (NPV). In recent years, second-order MC algorithms have been shown to offer an advantage over simple MC techniques due to the added consideration of uncertainties associated with the probability distributions of the relevant variables. Here, a strategy for the application of the second-order MC technique to a case study is demonstrated to analyze its effectiveness as a tool for portfolio management.
Keywords: Monte Carlo algorithms, portfolio management, profitability, risk analysis
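A minimal sketch of the second-order (nested) MC structure described above: the outer loop samples uncertain parameters of the input distributions, the inner loop samples the inputs themselves. All distributions and figures are invented for illustration.

```python
# Second-order Monte Carlo for a petroleum-prospect NPV: parameter
# uncertainty (outer loop) wrapped around ordinary input sampling (inner).
import numpy as np

rng = np.random.default_rng(3)
outer, inner = 200, 1000
discount, years = 0.10, 10

npv_means = []
for _ in range(outer):
    # Outer level: the distribution parameters are themselves uncertain
    price_mu = rng.normal(70, 10)          # mean $/bbl, uncertain
    reserves_mu = rng.normal(5e6, 1e6)     # mean recoverable bbl, uncertain

    # Inner level: ordinary MC given those sampled parameters
    price = rng.normal(price_mu, 15, inner)
    reserves = rng.normal(reserves_mu, 0.5e6, inner)
    capex = rng.normal(120e6, 20e6, inner)
    annual_cash = price * reserves / years - 5e6   # minus flat OPEX
    npv = -capex + sum(annual_cash / (1 + discount) ** t
                       for t in range(1, years + 1))
    npv_means.append(npv.mean())

print(f"P(mean NPV > 0) = {np.mean(np.array(npv_means) > 0):.2f}")
```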
Procedia PDF Downloads 340
2692 Agent-Base Modeling of IoT Applications by Using Software Product Line
Authors: Asad Abbas, Muhammad Fezan Afzal, Muhammad Latif Anjum, Muhammad Azmat
Abstract:
The Internet of Things (IoT) is used to link up real objects that use the internet to interact. IoT applications allow handling and operating equipment in accordance with environmental needs, for example in transportation and healthcare. IoT devices are linked together via a number of agents that act as middlemen for communications. The operation of a heat sensor differs indoors and outdoors because agent applications work with environmental variables. In this article, we suggest using Software Product Line (SPL) to model the features of IoT agents and applications on an XML basis. XML-based feature modelling can handle the contextual diversity within the same application domain and increases the reusability of features. For the purpose of managing contextual variability, we have embraced XML for modelling IoT applications, agents, and internet-connected devices.
Keywords: IoT agents, IoT applications, software product line, feature model, XML
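A hedged sketch of the XML-based feature modelling proposed above: a tiny feature model for an IoT agent whose optional features are selected by deployment context. The tags and attributes are assumptions, not the authors' schema.

```python
# Parse a toy SPL feature model and derive a context-specific configuration
# (e.g., the indoor vs. outdoor heat-sensor variability from the abstract).
import xml.etree.ElementTree as ET

FEATURE_MODEL = """
<agent name="HeatSensorAgent">
  <feature name="TemperatureReading" mandatory="true"/>
  <feature name="IndoorCalibration" mandatory="false" context="indoor"/>
  <feature name="WeatherCompensation" mandatory="false" context="outdoor"/>
</agent>
"""

def configure(xml_text, context):
    root = ET.fromstring(xml_text)
    selected = [f.get("name") for f in root.findall("feature")
                if f.get("mandatory") == "true" or f.get("context") == context]
    return root.get("name"), selected

print(configure(FEATURE_MODEL, "indoor"))
# ('HeatSensorAgent', ['TemperatureReading', 'IndoorCalibration'])
```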
Procedia PDF Downloads 98
2691 Parental Rejection and Psychological Adjustment among Adolescents: Does the Peer Rejection Mediate?
Authors: Sultan Shujja, Farah Malik
Abstract:
The study examined the mediating role of peer rejection in the direct relationship between parental rejection and psychological adjustment among adolescents. Researchers used self-report measures, e.g., the Parental Acceptance-Rejection Questionnaire (PARQ), the Children's Rejection Sensitivity Questionnaire (CRSQ), and the Personality Assessment Questionnaire (PAQ), to assess perceptions of parent and peer rejection and psychological adjustment among adolescents (14-18 years). Findings revealed that peer rejection did not mediate the relationship between parental rejection and psychological adjustment, whereas parental rejection emerged as a strong predictor when demographic variables were statistically controlled. On average, girls were psychologically less adjusted than boys. Despite an equal perception of peer rejection, girls anticipated peer rejection more anxiously than did boys. It is suggested that peer influence on adolescents, specifically girls, should not be underestimated.
Keywords: peer relationships, parental perception, psychological adjustment, applied psychology
Procedia PDF Downloads 514
2690 Relationship of Workplace Stress and Mental Wellbeing among Health Professionals
Authors: Rabia Mushtaq, Uroosa Javaid
Abstract:
It has been observed that health professionals are at higher risk of stress because being a specialist is physically and emotionally demanding. The study aimed to investigate the relationship between workplace stress and mental wellbeing among health professionals. A sample of 120 male and female health professionals belonging to two age groups, i.e., early adulthood and middle adulthood, was recruited through a purposive sampling technique. The Job Stress Scale, the Mindful Attention Awareness Scale, and the Warwick-Edinburgh Mental Wellbeing Scale were used for the measurement of the study variables. Results of the study indicated that job stress has a significant negative relationship with mental wellbeing among health professionals. The current study opened the door for more exploratory work on mindfulness among health professionals. The outcomes yielded help in consolidating coping procedures among workers to improve their mental wellbeing and lessen job stress.
Keywords: health professionals, job stress, mental wellbeing, mindfulness
Procedia PDF Downloads 177
2689 Corporate Governance in Africa: A Review of Literature
Authors: Kisanga Arsene
Abstract:
The abundant literature on corporate governance identifies four main objectives: the configuration of power within firms, control, conflict prevention and the equitable distribution of value created. The persistent dysfunctions in companies in developing countries in general and in African countries, in particular, show that these objectives are generally not achieved, which supports the idea of analyzing corporate governance practices in Africa. Indeed, the objective of this paper is to review the literature on corporate governance in Africa, to outline the specific practices and challenges of corporate governance in Africa and to identify reliable indicators and variables to capture corporate governance in Africa. In light of the existing literature, we argue that corporate governance in Africa can only be studied in the light of African realities and by taking into account the institutional environment. These studies show the existence of a divide between governance practices and the legislative and regulatory texts in force in the African context.
Keywords: institutional environment, transparency, accountability, Africa
Procedia PDF Downloads 180
2688 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
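A hedged sketch of the ensemble step described above: train the three model families, rank them on validation data, and average the predicted probabilities of the top performers. The features, dataset, and threshold are placeholders, not the study's Los Angeles data.

```python
# Average the predicted probabilities of the top-performing models to
# smooth out any one model's weaknesses, as the abstract describes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Stand-in for timing variables, weather forecasts, past pollutant levels
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

models = [LogisticRegression(max_iter=1000),
          RandomForestClassifier(n_estimators=200, random_state=0),
          MLPClassifier(max_iter=1000, random_state=0)]

fitted = [m.fit(X_tr, y_tr) for m in models]
scores = [m.score(X_val, y_val) for m in fitted]
top = [m for _, m in sorted(zip(scores, fitted),
                            key=lambda t: t[0], reverse=True)[:3]]

avg_proba = np.mean([m.predict_proba(X_val)[:, 1] for m in top], axis=0)
ensemble_acc = np.mean((avg_proba > 0.5) == y_val)
print(f"ensemble accuracy: {ensemble_acc:.3f}")
```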
Procedia PDF Downloads 130
2687 Palliative Care Referral Behavior Among Nurse Practitioners in Hospital Medicine
Authors: Sharon Jackson White
Abstract:
Purpose: Nurse practitioners (NPs) practicing within hospital medicine play a significant role in caring for patients who might benefit from palliative care (PC) services. Using the Theory of Planned Behavior, the purpose of this study was to examine the relationships among facilitators to referral, barriers to referral, self-efficacy with end-of-life discussions, history of referral, and referring to PC among NPs in hospital medicine. Hypotheses: 1) Perceived facilitators to referral will be associated with a higher history of referral and a higher number of referrals to PC. 2) Perceived barriers to referral will be associated with a lower history of referral and a lower number of referrals to PC. 3) Increased self-efficacy with end-of-life discussions will be associated with a higher history of referral and a higher number of referrals to PC. 4) Perceived facilitators to referral, perceived barriers to referral, and self-efficacy with end-of-life discussions will contribute to a significant variance in the history of referral to PC. 5) Perceived facilitators to referral, perceived barriers to referral, and self-efficacy with end-of-life discussions will contribute to a significant variance in the number of referrals to PC. Significance: Previous studies of referring patients to PC within the hospital setting have focused on physician practices. Identifying factors that influence NPs referring hospitalized patients to PC is essential to ensure that patients have access to these important services. This study incorporates the SNRS mission of advancing nursing research through the dissemination of research findings and the promotion of nursing science. Methods: A cross-sectional, predictive correlational study was conducted. History of referral to PC, facilitators to referring to PC, barriers to referring to PC, self-efficacy in end-of-life discussions, and referral to PC were measured using the PC referral case study survey, the facilitators and barriers to PC referral survey, and the self-assessment with end-of-life discussions survey. Data were analyzed descriptively and with Pearson's correlation, Spearman's rho, point-biserial correlation, multiple regression, logistic regression, the chi-square test, and the Mann-Whitney U test. Results: Only one facilitator (the PC team being helpful with establishing goals of care) was significantly associated with referral to PC. Three variables were statistically significant in relation to the history of referring to PC: “Inclined to refer: PC can help decrease the length of stay in hospital”, “Most inclined to refer: Patients with serious illnesses and/or poor prognoses”, and “Giving bad news to a patient or family member”. No predictor variables contributed a significant variance in the number of referrals to PC for all three case studies. There were no statistically significant results showing a relationship between the history of referral and referral to PC. All five hypotheses were partially supported. Discussion: Findings from this study emphasize the need for further research on NPs who work in hospital settings and the factors that influence their behaviors of referring to PC. Since there is an increase in NPs practicing within hospital settings, future studies should use a larger sample size and incorporate hospital medicine NPs and other types of NPs that work in hospitals.
Keywords: palliative care, nurse practitioners, hospital medicine, referral
Procedia PDF Downloads 75
2686 Optimum Dispatching Rule in Solar Ingot-Wafer Manufacturing System
Authors: Wheyming Song, Hung-Hsiang Lin, Scott Lian
Abstract:
In this research, we investigate the optimal dispatching rule for machine and manpower allocation in solar ingot-wafer manufacturing systems. The performance of the method is measured by the sales profit for each dollar paid to the operators over one week at steady state. The decision variables are the identification numbers of the machines and operators assigned when each job is required to be served in each process. We propose a rule that is a function of the operator's ability, corresponding salary, and standing location while in the factory. The rule is named the 'multi-nominal distribution dispatch rule'. The proposed rule performs better than many traditional rules, including the genetic algorithm and particle swarm optimization. Simulation results show that the proposed multi-nominal distribution dispatch rule improves the sales profit dramatically.
Keywords: dispatching, solar ingot, simulation, flexsim
Procedia PDF Downloads 301
2685 Targeted Effects of Subsidies on Prices of Selected Commodities in Iran Market
Authors: Sayedramin Hashemianesfehani, Seyed Hossein Hosseinilargani
Abstract:
In this study, we attempt to determine to what extent the increase in prices of selected commodities in the Iranian market originated from the implementation of the targeted subsidies law. Hence, an econometric model based on existing theories of price increases and price transmission, through which inflation is transferred, is developed. In other words, the world price index and the dummy variables defined for the targeted subsidies have a significant and positive impact on the producer price index. The obtained results indicate that the targeted subsidies act in Iran has influential long- and short-term impacts on producer price indexes. Finally, an analysis of world dairy product prices and domestic dairy prices with respect to the major parameters is carried out to obtain some managerial results.
Keywords: econometric models, targeted subsidies, consumer price index (CPI), producer price index (PPI)
Procedia PDF Downloads 361