Search results for: Statistical Approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16739

15539 Effectiveness of Acceptance and Commitment Therapy on Reducing Corona Disease Anxiety in the Staff Working in Shahid Beheshti Hospital of Shiraz

Authors: Gholam Reza Mirzaei

Abstract:

This research investigated the effectiveness of acceptance and commitment therapy (ACT) in reducing corona disease anxiety among the staff of Shahid Beheshti Hospital of Shiraz. The study was quasi-experimental, with a pre-test and post-test and with experimental and control groups. The statistical population comprised all staff of Shahid Beheshti Hospital of Shiraz in 2021, from which 30 participants (n = 15 in the experimental group and n = 15 in the control group) were selected by convenience sampling. The instruments used in the study were the Cognitive Emotion Regulation Questionnaire (CERQ) and the Corona Disease Anxiety Scale (CDAS). Following data collection, the participants’ scores were analyzed using SPSS 20 at both the descriptive (mean and standard deviation) and inferential (analysis of covariance) levels. The analysis of covariance (ANCOVA) showed that ACT is effective in reducing corona disease anxiety (mental and physical symptoms) in the hospital staff: the effectiveness of ACT was 25.5% for mental symptoms and 13.8% for physical symptoms, and the post-test mean scores of the experimental group on both sub-scales of corona disease anxiety were lower than those of the control group.

Keywords: acceptance and commitment therapy, corona disease anxiety, hospital staff, Shiraz

Procedia PDF Downloads 27
15538 Impact of Ethnoscience-Based Teaching Approach: Thinking Relevance, Effectiveness and Learner Retention in Physics Concepts of Optics

Authors: Rose C. Anamezie, Mishack T. Gumbo

Abstract:

Physics learners’ poor retention, which culminates in poor achievement and stems partly from teaching approaches unrelated to learners’ cultures in non-Western settings, warranted the study. The aim of this study was to determine the effectiveness of the ethnoscience-based teaching (EBT) approach on learners’ retention of the Physics concept of Optics in the Awka Education Zone of Anambra State, Nigeria. Two research questions and three null hypotheses, tested at the 0.05 level of significance, guided the study. A quasi-experimental design was adopted, specifically a non-equivalent control group design. The population for the study was 4,825 SS2 Physics learners in the zone, from which 160 SS2 learners were sampled using purposive and random sampling. The experimental group was taught the rectilinear propagation of light (RPL) using the EBT approach, while the control group was taught the same topic using the lecture method. The instrument for data collection was the 50-item Physics Retention Test (PRT), which was validated by three experts; its reliability, estimated with the Kuder-Richardson formula 20, yielded a coefficient of 0.81. The data were analysed using means, standard deviations and analysis of covariance (p < .05). The results showed higher retention with the EBT approach than with the lecture method, while gender had no significant effect on learners’ retention in Physics. It was recommended that the EBT approach, which bridged the gender gap in Physics retention, be adopted in secondary school teaching and learning, since it could transform science teaching, help learners construct new science concepts from their existing knowledge, and bridge the gap between Western science and learners’ worldviews.

Keywords: Ethnoscience-based teaching, optics, rectilinear propagation of light, retention

Procedia PDF Downloads 75
15537 A Bayesian Approach for Analyzing Academic Article Structure

Authors: Jia-Lien Hsu, Chiung-Wen Chang

Abstract:

Research articles often follow a simple and succinct structure of organizational patterns, called moves. For example, an extended abstract usually consists of five moves: Background, Aim, Method, Results, and Conclusion. Similarly, when publishing articles in PubMed, authors are encouraged to provide a structured abstract, one with distinct, labeled sections (e.g., Introduction, Methods, Results, Discussion) for rapid comprehension. This paper introduces a method for the computational analysis of move structures (i.e., Background-Purpose-Method-Result-Conclusion) in the abstracts and introductions of research documents, replacing a time-consuming and labor-intensive manual analysis process. In our approach, the sentences of a given abstract or introduction are automatically analyzed and labeled with a specific move (B-P-M-R-C in this paper) to reveal their rhetorical status. We expect that such an automatic analytical tool for move structures will help non-native speakers and novice writers become aware of appropriate move structures and internalize the relevant knowledge to improve their writing. We propose a Bayesian approach to determine move tags for research articles. The approach consists of two phases, a training phase and a testing phase. In the training phase, we build a Bayesian model from a set of given initial patterns and a corpus, a subset of CiteSeerX. Initially, the prior probability of the Bayesian model relies solely on the initial patterns; then, over the corpus, we process each document in turn, extracting features, determining tags, and updating the Bayesian model iteratively. In the testing phase, we compare our results with tags manually assigned by experts. In our experiments, the accuracy of the proposed approach reaches a promising 56%.
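The core scoring step of such a Bayesian move tagger can be sketched as a naive Bayes sentence classifier over the B-P-M-R-C tags. The sketch below is only illustrative: the training sentences are invented placeholders (not the CiteSeerX corpus), and the paper's iterative model updating and feature extraction are omitted.

```python
import math
from collections import Counter, defaultdict

# Toy training data: (sentence, move tag) pairs using the B-P-M-R-C
# scheme (Background, Purpose, Method, Result, Conclusion). These
# sentences are invented for illustration only.
TRAIN = [
    ("previous studies have examined this problem", "B"),
    ("little attention has been paid to this issue", "B"),
    ("this paper aims to propose a new approach", "P"),
    ("the purpose of this study is to evaluate the model", "P"),
    ("we collected data and trained a classifier", "M"),
    ("the experiments were conducted on a benchmark corpus", "M"),
    ("the results show an accuracy of fifty six percent", "R"),
    ("our approach outperforms the baseline", "R"),
    ("we conclude that the method is promising", "C"),
    ("future work will extend the model", "C"),
]

def train(pairs):
    # Count tag frequencies and per-tag word frequencies.
    tag_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for sent, tag in pairs:
        tag_counts[tag] += 1
        for w in sent.split():
            word_counts[tag][w] += 1
            vocab.add(w)
    return tag_counts, word_counts, vocab

def predict(sent, tag_counts, word_counts, vocab):
    # Pick the tag maximizing log P(tag) + sum log P(word | tag),
    # with add-one smoothing over the vocabulary.
    total = sum(tag_counts.values())
    best, best_lp = None, float("-inf")
    for tag, n in tag_counts.items():
        lp = math.log(n / total)
        denom = sum(word_counts[tag].values()) + len(vocab)
        for w in sent.split():
            lp += math.log((word_counts[tag][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = tag, lp
    return best

tag_counts, word_counts, vocab = train(TRAIN)
```

In the paper's setting, the model would additionally be updated document by document over the corpus rather than trained once.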

Keywords: academic English writing, assisted writing, move tag analysis, Bayesian approach

Procedia PDF Downloads 321
15536 The Control of Wall Thickness Tolerance during Pipe Purchase Stage Based on Reliability Approach

Authors: Weichao Yu, Kai Wen, Weihe Huang, Yang Yang, Jing Gong

Abstract:

Metal-loss corrosion is a major threat to the safety and integrity of gas pipelines, as it may result in burst failures whose consequences can include enormous economic losses as well as personnel casualties. It is therefore important to ensure the integrity and efficiency of corroding pipelines, and the wall thickness plays an important role in the failure probability of a corroding pipeline. In practice, the wall thickness is controlled at the pipe purchase stage; for example, the API SPEC 5L standard regulates the allowable tolerance of the wall thickness from the specified value. The allowable wall thickness tolerance determines the characteristics of the wall thickness distribution, such as its mean value, standard deviation and distribution type. Taking the uncertainties of the input variables in the burst limit-state function into account, a reliability approach, rather than a deterministic approach, is used to evaluate the failure probability. Moreover, the allowable wall thickness tolerance influences the pipe purchase cost: stricter control of the wall thickness usually corresponds to a higher cost. Changing the wall thickness tolerance therefore varies both the probability of a burst failure and the cost of the pipe. This paper describes an approach to optimizing the wall thickness tolerance that considers both the safety and the economy of corroding pipelines. The corrosion burst limit-state function in Annex O of CSA Z662-7 is employed to evaluate the failure probability using the Monte Carlo simulation technique. Changing the allowable wall thickness tolerance changes the parameters of the wall thickness distribution in the limit-state function, and the reliability approach shows the corresponding variations in the burst failure probability.
On the other hand, changing the wall thickness tolerance changes the pipe purchase cost. From the variation of the failure probability and pipe cost caused by changing the wall thickness tolerance specification, the optimal allowable tolerance can be obtained and used to define the pipe purchase specifications.
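The Monte Carlo reliability step described above can be sketched as follows. This is not the CSA Z662 Annex O limit-state function: it uses a simplified Barlow-type burst model, maps the purchase tolerance to a normal wall-thickness distribution via an assumed ±3-sigma rule, and all parameter values are illustrative placeholders.

```python
import random

def sample_wall_thickness(nominal_t, tol_fraction, rng):
    # Map the allowed tolerance band to a normal distribution whose
    # +/- 3-sigma range equals the band (an assumption, not a mapping
    # prescribed by API SPEC 5L or CSA Z662).
    sigma = nominal_t * tol_fraction / 3.0
    return rng.gauss(nominal_t, sigma)

def burst_failure_probability(nominal_t, tol_fraction, n=50_000, seed=42):
    """Estimate P(burst failure) by crude Monte Carlo simulation.

    Simplified Barlow-type limit state:
        g = 2 * (t - d) * sigma_u / D - p_op
    where d is the corrosion defect depth; failure when g < 0.
    All values below are illustrative, not from the paper.
    """
    D = 0.6            # pipe diameter, m
    sigma_u = 520e6    # ultimate tensile strength, Pa
    p_op = 10e6        # operating pressure, Pa
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        t = sample_wall_thickness(nominal_t, tol_fraction, rng)
        d = abs(rng.gauss(0.002, 0.0008))   # defect depth, m (illustrative)
        g = 2.0 * max(t - d, 0.0) * sigma_u / D - p_op
        if g < 0:
            failures += 1
    return failures / n
```

Rerunning the estimator while sweeping the tolerance (and pricing each tolerance level) is what yields the safety-versus-cost trade-off the paper optimizes.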

Keywords: allowable tolerance, corroding pipeline segment, operation cost, production cost, reliability approach

Procedia PDF Downloads 385
15535 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

Data arising in many settings frequently have a hierarchical or nested structure, and multilevel modelling is the modern approach to handling such data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset, so checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, it is equally important to first confirm that the GOF test itself performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the MQL combinations. The power of the test was adequate for most combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
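"Maintaining the desirable Type-I error" in such simulation studies means that, when the fitted model is correct, the test rejects at roughly the nominal rate. A minimal sketch of how that empirical rate is estimated, assuming only that a well-calibrated test yields uniform p-values under the null (the actual study simulates full multilevel datasets in MLwiN):

```python
import random

def empirical_type1_error(n_sim=20_000, alpha=0.05, seed=1):
    # Under the null hypothesis a well-calibrated test produces p-values
    # uniform on (0, 1), so the rejection rate p < alpha should be close
    # to alpha. Here uniform draws stand in for simulated-data p-values.
    rng = random.Random(seed)
    rejections = sum(1 for _ in range(n_sim) if rng.random() < alpha)
    return rejections / n_sim
```

A test whose empirical rate drifts far from alpha (as reported for the MQL combinations) is then judged unsuitable for that estimation method.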

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 132
15534 Assessment of E-Readiness in Libraries of Public Sector Universities Khyber Pakhtunkhwa-Pakistan

Authors: Saeed Ullah Jan

Abstract:

This study examined the e-readiness of the libraries of public sector universities in Khyber Pakhtunkhwa, evaluating the availability of human resources, electronic infrastructure, and network services and programs in these libraries. The population of the study was the twenty-seven public sector university libraries of Khyber Pakhtunkhwa. A quantitative approach was adopted, and a questionnaire-based survey was conducted to collect data from the librarians in charge of the libraries. The collected data were analyzed using the Statistical Package for the Social Sciences, version 22 (SPSS). The mean score of the knowledge component fell below three, indicating that the respondents are only poorly to moderately satisfied with the knowledge component of their libraries. Their satisfaction with the other components, such as electronic infrastructure, network services and programs, and enhancers of the networked world, was likewise rated average or below. The study suggests that major aspects of the existing public sector university libraries require significant transformation; for this purpose, the government should provide all the resources and facilities required to meet the population's informational and recreational demands. The information and communication technology (ICT) infrastructure of public university libraries needs improvement in terms of the availability of computer equipment, databases, network servers, multimedia projectors, digital cameras, uninterruptible power supplies, scanners, and backup devices such as hard discs and Digital Video Discs/Compact Discs.

Keywords: ICT-libraries, e-readiness-libraries, e-readiness-university libraries, e-readiness-Pakistan

Procedia PDF Downloads 77
15533 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and thus generate new concepts and new semantic links. Even with the more specialized vocabularies of the oil domain, our approach achieved satisfactory results, suggesting that the proposal can be applied to other domains and languages with only minor adjustments.

Keywords: semantic links, data mining, linked data, SKOS

Procedia PDF Downloads 166
15532 Animal Welfare Violations during Treatment at Different Level of Veterinary Hospitals

Authors: Aparna Datta, Mahabub Alam

Abstract:

Animal welfare is a comparatively new area of research in Bangladesh, and welfare concern for animals is increasing day by day. This study, conducted between January and May 2017, investigated animal welfare violations during treatment at different levels of veterinary hospitals in Bangladesh and India. The recorded data (N=180) were categorized into eight major types of violation: delay in starting treatment, non-specific treatment, surgery without anesthesia, use of unsterilized needles, rough and painful handling, fearful approach, multiple pricking during injection and use of blunt needles. The categorized groups were analyzed by hospital level: Upazila Veterinary Hospitals, Bangladesh (UVHs), SAQ Teaching Veterinary Hospital, Bangladesh (SAQTVH) and Veterinary College and Research Institute, India (VCRI). Among all hospitals, violations during treatment occurred most frequently in UVHs. Among all violations, surgery without anesthesia was found only in UVHs, where it occurred in a considerable proportion of cases (80%). As for the other major violations, non-specific treatment was observed in 69% of cases in UVHs, 13% in SAQTVH and 5% in VCRI, and the use of unsterilized instruments during treatment was also higher in UVHs (65%) than in SAQTVH (5%) and VCRI (1%). Delay in starting treatment varied insignificantly, at 26-42% across the different levels of hospitals. Although multiple pricking during injection was found in 30% of cases in UVHs, the variation across hospital levels was not statistically significant (p>0.05). The findings of this study will help in taking the necessary steps to control violations of animal welfare during treatment. A comprehensive study considering all levels of hospitals, including field treatment, is also recommended to identify welfare violations during treatment.

Keywords: animal welfare, treatment, veterinary hospitals, violations

Procedia PDF Downloads 145
15531 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach

Authors: Kristina Pflug, Markus Busch

Abstract:

Being able to predict polymer properties and processing behavior from the applied operating reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes with high safety requirements, such as the high-pressure polymerization of low-density polyethylene (LDPE), the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated via a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with numerical modelling of the complex reaction network of the LDPE polymerization, taking the actual reaction conditions into consideration. While this yields average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE, i.e., its melt flow behavior, is determined from the previously calculated polymeric microstructure using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscometry and a multi-angle light scattering detector is applied; it serves to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension characterize the polymeric flow behavior. The agreement between experimental and modelled results was found to be excellent, especially considering that the multi-scale modelling approach involves no parameter fitting to the data. This validates the suggested approach and proves its universality at the same time.
In the next step, the modelling approach can be applied to other reactor types, such as tubular reactors, or to the industrial scale. Moreover, sensitivity analyses for systematically varied process conditions are easily feasible. The developed multi-scale modelling approach ultimately makes it possible to predict and design LDPE processing behavior simply from process conditions such as feed streams and inlet temperatures and pressures.

Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology

Procedia PDF Downloads 115
15530 Teachers’ Reactions, Learning, Organizational Support, and Use of Lesson Study for Transformative Assessment

Authors: Melaku Takele Abate, Abbi Lemma Wodajo, Adula Bekele Hunde

Abstract:

This study explored mathematics teachers' reactions, learning, school leaders' support, and use of the Lesson Study for Transformative Assessment (LSforTA) program in practice. Because the LSforTA program was new, a local and grounded approach was needed to examine the knowledge and skills teachers acquired using it, so a design-based research approach was selected to evaluate and refine LSforTA. The results showed that LSforTA increased teachers' knowledge and use of different levels of mathematics assessment tasks. The program positively affected teachers' practices of transformative assessment and enhanced their knowledge and skills in assessing students in a transformative way. The paper concludes by describing how the LSforTA procedures were adapted in response to this evaluation and provides suggestions for future development and research.

Keywords: classroom assessment, feedback practices, lesson study, mathematics, design-based research

Procedia PDF Downloads 46
15529 A Religious Book Translation by Pragmatic Approach: The Vajrachedika-Prajna-Paramita Sutra

Authors: Yoon-Cheol Park

Abstract:

This research examines the Chinese character-to-Korean translation of the Vajrachedika-prajna-paramita sutra from a pragmatic approach; no previous research has examined this translation from such a perspective. Although the sutra is composed of conversational exchanges between the Buddha and his disciple, unlike other Buddhist sutras, most of its translations show traces of literal translation and still overlook the pragmatic elements in it. It is accordingly meaningful to examine its messages through the speaker-hearer relation and the relation between speaker intention and utterance meaning. In practice, the Vajrachedika-prajna-paramita sutra includes pragmatic elements such as speech acts, presupposition, conversational implicature, the cooperative principle and politeness. First, the speech acts in the sutra text show that the translation reveals the obvious performative meanings of the language in the target text. Presupposition in the dialogues is conveyed by paraphrasing or substituting abstruse language with easy expressions. Conversational implicature in utterances makes it possible to understand the meanings of holy words by relying on utterance contexts; in particular, relevance increases the readability of the translation owing to the preceding utterance contexts. Finally, politeness in the target text is conveyed in a natural style through the honorific system of the Korean language. These elements show that the pragmatic approach can function as a useful device for conveying holy words in a specific, practical and direct way depending on utterance contexts. We therefore expect that taking a pragmatic approach to translating the Vajrachedika-prajna-paramita sutra will provide a theoretical foundation for seeking better translation methods than the literal translations of the past.
It also implies that the translation of Buddhist sutras needs to convey their messages through translation methods that take into account the characteristics of a sutra text like the Vajrachedika-prajna-paramita.

Keywords: buddhist sutra, Chinese character-Korean language translation, pragmatic approach, utterance context

Procedia PDF Downloads 394
15528 Towards an Indigenous Language Policy for National Integration

Authors: Odoh Dickson Akpegi

Abstract:

The paper is about the need for an indigenous language in order to meaningfully harness both human and material resources for the nation's integration. It then examines the knotty issue of the national language question and advocates a piecemeal approach to solving the problem. This approach allows for the development and use of local languages in minority areas, especially in Benue State, as a way of preparing them for consideration as possible replacements for English as Nigeria's national or official language. Finally, an arrangement for preparing the languages for such competition at the national level is presented.

Keywords: indigenous language, English language, official language, National integration

Procedia PDF Downloads 543
15527 Body Composition Analyser Parameters and Their Comparison with Manual Measurements

Authors: I. Karagjozova, B. Dejanova, J. Pluncevic, S. Petrovska, V. Antevska, L. Todorovska

Abstract:

Introduction: Medical assessment is important in sports medicine. To follow the health condition of subjects who perform sports, body composition parameters such as intracellular water, extracellular water, protein and mineral content, and muscle and fat mass may be useful. The aim of the study was to present the available parameters and to compare them to manual assessment. Material and methods: Twenty subjects (14 male and 6 female) aged 20±2 years were included in the study; 5 performed sports recreationally, while the others were professionals. The mean height was 175±7 cm, the mean weight was 72±9 kg, and the body mass index (BMI) was 23±2 kg/m2. The measured compartments were intracellular water (IW), extracellular water (EW), protein component (PC), mineral component (MC), skeletal muscle mass (SMM) and body fat mass (BFM). Lean balance was examined for the right arm (RA), left arm (LA), trunk (T), right leg (RL) and left leg (LL). Values calculated from manual measurements using the Matejka formula were compared with those obtained by a body composition analyzer (BCA), the InBody 720 (Biospace); the parameters used for the comparison were skeletal muscle mass (SMM) and body fat mass (BFM). Results: The BCA values were as follows: IW 22.6±5 L, EW 13.5±2 L, PC 9.8±0.9 kg, MC 3.5±0.3 kg, SMM 27±3 kg and BFM 13.8±4 kg. Lean balance showed the following values: RA 2.45±0.2 kg, LA 2.37±0.4 kg, T 20.9±5 kg, RL 7.43±1 kg and LL 7.49±1.5 kg. SMM showed a statistically significant difference between the manually obtained value of 51±1% and the BCA value of 45.5±3% (p<0.001). The manually obtained BFM was lower (17±2%) than the BCA value of 19.5±5.9% (p<0.02). Discussion: The obtained results showed appropriate values for the examined age for all examined parameters, which together give an overview of the body compartments important for performing sports.
From the comparison between the manual and BCA assessments, we conclude that manual measurements may differ from the analyzer's, as confirmed by the statistical significance of the differences.

Keywords: athletes, body composition, bio electrical impedance, sports medicine

Procedia PDF Downloads 469
15526 Support Vector Regression for Retrieval of Soil Moisture Using Bistatic Scatterometer Data at X-Band

Authors: Dileep Kumar Gupta, Rajendra Prasad, Pradeep Kumar, Varun Narayan Mishra, Ajeet Kumar Vishwakarma, Prashant K. Srivastava

Abstract:

An approach was evaluated for the retrieval of the soil moisture of a bare soil surface using bistatic scatterometer data in the angular range of 20° to 70° at VV- and HH-polarization. The microwave data were acquired by a specially designed X-band (10 GHz) bistatic scatterometer. A linear regression analysis between the scattering coefficients and the soil moisture content was performed to select a suitable incidence angle for the retrieval of soil moisture content; the 25° incidence angle was found most suitable. Support vector regression analysis was used to approximate the function described by the input-output relationship between the scattering coefficient and the corresponding measured values of the soil moisture content. The performance of the support vector regression algorithm was evaluated by comparing the observed and estimated soil moisture content using the statistical performance indices %Bias, root mean squared error (RMSE) and Nash-Sutcliffe Efficiency (NSE). At HH-polarization, the values of %Bias, RMSE and NSE were found to be 2.9451, 1.0986 and 0.9214, respectively; at VV-polarization, they were 3.6186, 0.9373 and 0.9428, respectively.
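The three performance indices used above have standard definitions and can be computed directly from paired observed/estimated values. A minimal sketch (note that sign and normalization conventions for %Bias vary slightly between authors; the paper does not give its exact formula):

```python
import math

def pct_bias(obs, est):
    # %Bias: average tendency of the estimates to over- or
    # under-shoot the observations, relative to their total.
    return 100.0 * sum(e - o for o, e in zip(obs, est)) / sum(obs)

def rmse(obs, est):
    # Root mean squared error of the estimates.
    return math.sqrt(sum((e - o) ** 2 for o, e in zip(obs, est)) / len(obs))

def nse(obs, est):
    # Nash-Sutcliffe Efficiency: 1 is a perfect fit; 0 means the model
    # predicts no better than the observed mean.
    mean_o = sum(obs) / len(obs)
    num = sum((o - e) ** 2 for o, e in zip(obs, est))
    den = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - num / den
```

With these indices, a regression model (such as the SVR above) is judged good when %Bias is near 0, RMSE is small, and NSE is near 1.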

Keywords: bistatic scatterometer, soil moisture, support vector regression, RMSE, %Bias, NSE

Procedia PDF Downloads 415
15525 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach

Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park

Abstract:

As the greenhouse effect has been recognized as a serious environmental problem of the world, interest in carbon dioxide (CO2) emissions, which comprise a major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of the world's total CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. Also, the performance-based design (PBD) methodology, based on nonlinear analysis, has been developed vigorously since the 1994 Northridge Earthquake to assure and assess the seismic performance of buildings more exactly, because structural engineers recognized that a prescriptive code-based design approach cannot address inelastic earthquake responses directly or assure building performance exactly. Although CO2 emissions and the PBD approach are both rising issues in the construction industry and structural engineering, few or no studies have considered the two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach at the structural design stage, considering the structural materials. A 4-story, 4-span reinforced concrete building was optimally designed, using the non-dominated sorting genetic algorithm-II (NSGA-II), to minimize its CO2 emissions and cost while satisfying a specific seismic performance objective (collapse prevention under the maximum considered earthquake) and prescriptive code regulations. The optimized design showed that minimal CO2 emissions and cost were achieved while the specified seismic performance was satisfied. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
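The ranking step at the heart of NSGA-II is non-dominated sorting: with two minimization objectives (CO2 emissions and cost), a design survives if no other design is at least as good in both objectives and strictly better in one. A standalone sketch of that first-front extraction, with invented candidate designs (the full algorithm adds crowding distance and genetic operators):

```python
def pareto_front(designs):
    """Return the non-dominated designs for two minimization objectives.

    Each design is (label, co2, cost). A design is dominated if another
    design is no worse in both objectives and strictly better in at
    least one.
    """
    front = []
    for a in designs:
        dominated = any(
            b[1] <= a[1] and b[2] <= a[2] and (b[1] < a[1] or b[2] < a[2])
            for b in designs
        )
        if not dominated:
            front.append(a)
    return front

# Hypothetical candidate frames: (name, CO2 emissions, relative cost).
candidates = [
    ("frame_A", 120.0, 1.00),
    ("frame_B", 100.0, 1.15),  # less CO2 but more cost: trades off with A
    ("frame_C", 130.0, 1.20),  # worse than A in both objectives: dominated
    ("frame_D", 110.0, 1.05),
]
```

The surviving front is the set of CO2-cost trade-offs from which a designer picks the final structure, subject to the seismic performance check.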

Keywords: CO2 emissions, performance based design, optimization, sustainable design

Procedia PDF Downloads 399
15524 Sustainable Approach in Textile and Apparel Industry: Case Study Applied to a Medium Enterprise

Authors: Maged Kamal

Abstract:

Previous research has suggested that enhancing environmental performance in the textile and apparel industry positively affects overall enterprise competitiveness. However, there is a gap in the literature regarding simplifying the available theory so that it can be implemented in practice with more confidence in the expected results, especially for small and medium enterprises. The aim of this paper is to simplify and make the best use of the relevant international norms to produce a systematic approach that can serve as a guideline for the practical application of the main sustainability principles in a medium-sized textile business. The increase in efficiency that resulted from implementing the suggested approach/model originated from reductions in raw material usage, savings in energy and water, and risk reduction for people and the environment. The practical case study was implemented in a textile factory producing knitted fabrics, ready-made garments, and dyed and printed fabrics. The results were analyzed to examine the effect of the suggested change on the enterprise's profitability.

Keywords: apparel industry, environmental management, sustainability, textiles

Procedia PDF Downloads 278
15523 Non-Linear Assessment of Chromatographic Lipophilicity of Selected Steroid Derivatives

Authors: Milica Karadžić, Lidija Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Kovačević, Anamarija Mandić, Aleksandar Oklješa, Andrea Nikolić, Marija Sakač, Katarina Penov Gaši

Abstract:

Using a chemometric approach, the relationships between the chromatographic lipophilicity and in silico molecular descriptors of twenty-nine selected steroid derivatives were studied. The chromatographic lipophilicity was predicted using the artificial neural network (ANN) method. The most important in silico molecular descriptors were selected by applying stepwise selection (SS) paired with the partial least squares (PLS) method, and descriptors with satisfactory variable importance in projection (VIP) values were retained for ANN modeling. The usefulness of the generated models was confirmed by detailed statistical validation: high agreement between experimental and predicted values indicated that the obtained models have good quality and high predictive ability, and global sensitivity analysis (GSA) confirmed the importance of each molecular descriptor used as an input variable. The high-quality networks indicate a strong non-linear relationship between chromatographic lipophilicity and the in silico molecular descriptors used. By applying the selected molecular descriptors and the generated ANNs, a good prediction of the chromatographic lipophilicity of the studied steroid derivatives can be obtained. This article is based upon work from COST Actions CM1306 and CA15222, supported by COST (European Cooperation in Science and Technology).

Keywords: artificial neural networks, chemometrics, global sensitivity analysis, liquid chromatography, steroids

Procedia PDF Downloads 331
15522 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network

Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi

Abstract:

Energy, delay, and bandwidth are the prime issues in wireless sensor networks (WSNs). Energy usage optimization and efficient bandwidth utilization are important issues in a WSN. Event-triggered data aggregation facilitates such optimization for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from the sensor nodes within the cluster. (3) The cluster head identifies and classifies the events in the collected data using a Bayesian classifier. (4) The data are aggregated using a statistical method. (5) The cluster head discovers paths to the sink node using residual energy, path distance, and bandwidth. (6) If the aggregated data is critical, the cluster head sends it over multiple paths for reliable communication. (7) Otherwise, the aggregated data is transmitted towards the sink node over the single path with the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
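Steps (4)-(7) of the cluster-head logic above can be sketched as follows; the readings, path metrics, scoring weights, and criticality threshold are all invented for illustration (the Bayesian event classification of step (3) is omitted for brevity):

```python
import random
import statistics

random.seed(0)
# Hypothetical cluster of 20 sensors reporting temperature readings
readings = [random.gauss(30.0, 0.5) for _ in range(20)]

# (4) Statistical aggregation: one summary packet instead of 20 raw packets
aggregate = {"mean": statistics.mean(readings),
             "stdev": statistics.pstdev(readings),
             "n": len(readings)}

# (5) Candidate paths to the sink, scored on residual energy, distance, bandwidth
paths = [{"id": 1, "energy": 0.8, "dist": 5, "bw": 2.0},
         {"id": 2, "energy": 0.5, "dist": 3, "bw": 3.5},
         {"id": 3, "energy": 0.9, "dist": 7, "bw": 1.0}]
score = lambda p: p["energy"] * p["bw"] / p["dist"]   # illustrative weighting

critical = aggregate["mean"] > 35.0                   # e.g. a fire-event threshold
if critical:
    route = sorted(paths, key=score, reverse=True)[:2]  # (6) multipath for reliability
else:
    route = [max(paths, key=score)]                     # (7) best single path
print(aggregate["mean"], [p["id"] for p in route])
```

The aggregation-vs-multipath trade-off is the core idea: summarizing reduces transmissions, while path redundancy is reserved for critical events only.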

Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication

Procedia PDF Downloads 437
15521 Self-Efficacy Perceptions of Pre-Service Art and Music Teachers towards the Use of Information and Communication Technologies

Authors: Agah Tugrul Korucu

Abstract:

Information and communication technologies have become an important part of our daily lives with the significant investments in technology made in the 21st century. Individuals with higher computer self-efficacy are more willing to design and implement computer-related activities and tend to carry out these activities more successfully. Self-efficacy is a significant factor that determines how individuals act in events, situations, and difficult processes; it is observed that individuals with a higher perception of computer self-efficacy overcome problems related to computer use more easily. Therefore, this study aimed to examine the self-efficacy perceptions of pre-service art and music teachers towards the use of information and communication technologies in terms of different variables. The research group consists of 60 pre-service teachers studying in the Art and Music department of Necmettin Erbakan University, Ahmet Keleşoğlu Faculty of Education. The data collection tools of the study were a "personal information form" developed by the researcher and used to collect demographic data, and "the perception scale of self-efficacy in information technology". The scale is a 5-point Likert-type scale consisting of 27 items. The Kaiser-Meyer-Olkin (KMO) sampling adequacy value was found to be 0.959, and the Cronbach alpha reliability coefficient of the scale was found to be 0.97. A computer-based statistical software package (SPSS 21.0) was used to analyze the data; descriptive statistics, t-tests, and analysis of variance were used as statistical techniques.
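The reliability coefficient reported above can be reproduced with the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The Likert responses below are synthetic stand-ins for the real questionnaire data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stand-in: 60 respondents x 27 Likert items (1-5), driven by one
# shared latent trait so the items are correlated, as in a coherent scale
latent = rng.normal(size=(60, 1))
items = np.clip(np.rint(3 + latent + 0.8 * rng.normal(size=(60, 27))), 1, 5)

k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```

With strongly correlated items, alpha approaches 1; values above roughly 0.9, as reported in the abstract, indicate excellent internal consistency.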

Keywords: self-efficacy perceptions, teacher candidate, information and communication technologies, art teacher

Procedia PDF Downloads 317
15520 Evaluating the Factors Controlling the Hydrochemistry of Gaza Coastal Aquifer Using Hydrochemical and Multivariate Statistical Analysis

Authors: Madhat Abu Al-Naeem, Ismail Yusoff, Ng Tham Fatt, Yatimah Alias

Abstract:

Groundwater in the Gaza Strip is increasingly exposed to anthropic and natural factors that seriously impact groundwater quality. Physiochemical data on groundwater can offer important information on changes in groundwater quality that can be useful in improving water management tactics. Integrative hydrochemical and statistical techniques (hierarchical cluster analysis (HCA) and factor analysis (FA)) were applied to ten physiochemical parameters of 84 samples collected in 2000/2001, using the STATA, AquaChem, and Surfer software packages, to: 1) provide valuable insight into the salinization sources and the hydrochemical processes controlling the chemistry of the groundwater, and 2) differentiate the influence of natural processes and man-made activities. The large recorded diversity in water facies, with dominance of the Na-Cl type, reveals a highly saline aquifer impacted by multiple complex hydrochemical processes. Based on WHO standards, only 15.5% of the wells were suitable for drinking. HCA yielded three clusters. Cluster 1 is the highest in salinity, mainly due to the impact of Eocene saline water invasion mixed with human inputs. Cluster 2 is the lowest in salinity, also due to Eocene saline water invasion but mixed with recent rainfall recharge, limited carbonate dissolution, and nitrate pollution. Cluster 3 is similar in salinity to Cluster 2, but with a high diversity of facies due to the impact of many sources of salinity, such as seawater invasion, carbonate dissolution, and human inputs. Factor analysis yielded two factors accounting for 88% of the total variance. Factor 1 (59%) is a salinization factor demonstrating the mixing contribution of natural saline water with human inputs. Factor 2 measures hardness and pollution and explains 29% of the total variance. The negative relationship between NO3- and pH may reveal a denitrification process in a heavily polluted aquifer recharged by limited oxygenated rainfall.
Multivariate statistical analysis combined with hydrochemical analysis indicates that the main factors controlling groundwater chemistry are Eocene saline invasion, seawater invasion, sewage invasion, and rainfall recharge, and that the main hydrochemical processes are base ion and reverse ion exchange with clay minerals (water-rock interactions), nitrification, carbonate dissolution, and a limited denitrification process.
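The HCA/FA workflow above can be sketched in a few lines. The data here are synthetic (two well groups, only three illustrative parameters instead of ten), and Ward linkage is an assumption, since the abstract does not state the linkage method used:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
# Synthetic stand-in for 84 wells; three illustrative parameters (e.g. Na, Cl, NO3)
low = rng.normal([30, 40, 5], 4, size=(40, 3))    # fresher wells
high = rng.normal([90, 120, 8], 6, size=(44, 3))  # saline wells
X = np.vstack([low, high])

# Standardize first (parameters carry very different units), then cluster
Z = (X - X.mean(axis=0)) / X.std(axis=0)
clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

# Factor analysis on the same standardized data
fa = FactorAnalysis(n_components=2).fit(Z)
print(np.bincount(clusters)[1:], fa.components_.shape)
```

Cutting the dendrogram at three clusters mirrors the three hydrochemical groups reported in the study; the two-factor FA mirrors the salinization and hardness/pollution factors.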

Keywords: dendrogram and cluster analysis, water facies, Eocene saline invasion and sea water invasion, nitrification and denitrification

Procedia PDF Downloads 349
15519 Progressive Participatory Observation Applied to Priority Neighbourhoods

Authors: Serge Rohmer

Abstract:

This paper proposes a progressive participatory observation that can be used as a sociological investigation within communities. The usefulness of participant observation in sociological projects is first asserted, particularly in an urban context. Competencies, know-how, and interpersonal skills are then explained before detailing the progressive approach, which consists of four levels of observation. The progressive participatory observation is applied to an experimental project to set up a permaculture urban micro-farm with residents of a priority neighbourhood. Feedback on the experiment has identified several key recommendations for implementing the approach.

Keywords: participatory observation, observation scale, priority neighbourhood, urban sociology

Procedia PDF Downloads 13
15518 Implementation of a Lattice Boltzmann Method for Pulsatile Flow with Moment Based Boundary Condition

Authors: Zainab A. Bu Sinnah, David I. Graham

Abstract:

The Lattice Boltzmann Method has been developed and used to simulate both steady and unsteady fluid flow problems such as turbulent flows, multiphase flow and flows in the vascular system. As an example, the study of blood flow and its properties can give a greater understanding of atherosclerosis and the flow parameters which influence this phenomenon. The blood flow in the vascular system is driven by a pulsating pressure gradient which is produced by the heart. As a very simple model of this, we simulate plane channel flow under periodic forcing. This pulsatile flow is essentially the standard Poiseuille flow except that the flow is driven by the periodic forcing term. Moment boundary conditions, where various moments of the particle distribution function are specified, are applied at solid walls. We used a second-order single relaxation time model and investigated grid convergence using two distinct approaches. In the first approach, we fixed both Reynolds and Womersley numbers and varied relaxation time with grid size. In the second approach, we fixed the Womersley number and relaxation time. The expected second-order convergence was obtained for the second approach. For the first approach, however, the numerical method converged, but not necessarily to the appropriate analytical result. An explanation is given for these observations.
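A minimal D2Q9 BGK channel-flow sketch of the kind described above is given below. It uses simple full-way bounce-back walls and a Shan-Chen-style forcing term, not the paper's moment-based boundary condition, and the grid and relaxation parameters are illustrative. Replacing the constant body force F0 with F0*cos(w*t) gives the pulsatile case:

```python
import numpy as np

Nx, Ny, tau, steps = 8, 33, 0.8, 3000
F0 = 1e-5                                    # constant body force along x
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]            # opposite directions for bounce-back
f = np.tile(w[:, None, None], (1, Nx, Ny))   # start at rest, rho = 1
solid = np.zeros((Nx, Ny), bool)
solid[:, 0] = solid[:, -1] = True            # channel walls at the top/bottom rows

for t in range(steps):
    rho = f.sum(0)
    u = np.einsum('iab,ic->cab', f, e.astype(float)) / rho
    u[0] += tau * F0 / rho                   # Shan-Chen-style forcing via velocity shift
    eu = np.einsum('ic,cab->iab', e.astype(float), u)
    feq = w[:, None, None] * rho * (1 + 3*eu + 4.5*eu**2 - 1.5*(u**2).sum(0))
    f += (feq - f) / tau * (~solid)          # BGK collision on fluid nodes only
    for i in range(9):                       # streaming
        f[i] = np.roll(f[i], tuple(e[i]), axis=(0, 1))
    f[:, solid] = f[opp][:, solid]           # no-slip walls via full-way bounce-back

ux = (np.einsum('iab,ic->cab', f, e.astype(float)) / f.sum(0))[0]
print("centreline velocity: %.4f" % ux[Nx // 2, Ny // 2])
```

The converged profile is the expected near-parabolic Poiseuille shape, with maximum velocity of order F0*h^2/(2*nu) for half-width h and lattice viscosity nu = (tau - 0.5)/3.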

Keywords: Lattice Boltzmann method, single relaxation time, pulsatile flow, moment based boundary condition

Procedia PDF Downloads 221
15517 A Socio-Technical Approach to Cyber-Risk Assessment

Authors: Kitty Kioskli, Nineta Polemi

Abstract:

Evaluating the levels of cyber-security risk within an enterprise is essential to protecting its information system, services, and all its digital assets against security incidents (e.g., accidents, malicious acts, massive cyber-attacks). The existing risk assessment methodologies (e.g., EBIOS, OCTAVE, CRAMM, NIST SP 800-30) adopt a technical approach, considering only the capability, intention, and target of the attacker as attack factors and paying no attention to the attacker's psychological profile and personality traits. In this paper, a socio-technical approach to cyber risk assessment is proposed in order to achieve more realistic risk estimates by considering the personality traits of attackers. In particular, based upon principles from investigative psychology and behavioural science, a multi-dimensional, extended, quantifiable model of an attacker's profile is developed, which becomes an additional factor in the cyber risk level calculation.
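As a toy illustration of the idea, the sketch below folds a quantified attacker profile into a conventional likelihood x impact calculation. The trait names, weights, and 0.5-1.5 scaling are invented for illustration and are not the model proposed in the paper:

```python
# Hypothetical attacker-profile adjustment to a classic risk calculation.
def attacker_profile_factor(traits, weights=None):
    """Traits scored 0-1 (e.g. from behavioural-science questionnaires)."""
    weights = weights or {t: 1.0 for t in traits}
    s = sum(weights[t] * v for t, v in traits.items()) / sum(weights.values())
    return 0.5 + s   # scales likelihood from 0.5x (low threat) to 1.5x (high threat)

def risk_level(likelihood, impact, traits):
    # Likelihood and impact on a 1-5 scale, as in many ISO 27005-style matrices;
    # the profile factor adjusts likelihood, capped at the scale maximum.
    return min(likelihood * attacker_profile_factor(traits), 5.0) * impact

traits = {"risk_seeking": 0.8, "persistence": 0.6, "technical_confidence": 0.9}
print(risk_level(likelihood=3.0, impact=4.0, traits=traits))
```

The point is that two attacks with identical technical parameters can yield different risk levels once the attacker's behavioural profile enters the calculation.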

Keywords: attacker, behavioural models, cyber risk assessment, cybersecurity, human factors, investigative psychology, ISO27001, ISO27005

Procedia PDF Downloads 147
15516 Effect of 3-Dimensional Knitted Spacer Fabrics Characteristics on Its Thermal and Compression Properties

Authors: Veerakumar Arumugam, Rajesh Mishra, Jiri Militky, Jana Salacova

Abstract:

The thermo-physiological comfort and compression properties of knitted spacer fabrics were evaluated by varying different spacer fabric parameters. The air permeability and water vapor transmission of the fabrics were measured using the Textest FX-3300 air permeability tester and PERMETEST. The thermal behavior of the fabrics was then obtained with a thermal conductivity analyzer, and overall moisture management capacity was evaluated with a moisture management tester. The compression properties of the spacer fabrics were also tested using the Kawabata Evaluation System (KES-FB3). In the KES testing, the compression resilience, work of compression, linearity of compression, and other parameters were calculated from the pressure-thickness curves. Analysis of variance (ANOVA) was performed using the statistical software QC Expert Trilobite and Darwin in order to compare the influence of the different fabric parameters on the thermo-physiological and compression behavior of the samples. This study established that the raw material, type of spacer yarn, density, thickness, and tightness of the surface layer have a significant influence on both thermal conductivity and work of compression in spacer fabrics. The parameter that mainly influences the water vapor permeability of these fabrics is the raw material, i.e., the wetting and wicking properties of the fibers. The Pearson correlation between the moisture management capacity of the fabrics and their water vapour permeability was also computed using the same software. These findings are important requirements for the further design of clothing for extreme environmental conditions.
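The Pearson correlation mentioned above can be computed as follows; the measurements here are synthetic stand-ins for the fabric samples, since the study's raw data are not given in the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical stand-in measurements for 15 fabric samples: moisture management
# capacity vs. water vapour permeability (both as unitless indices here)
mmc = rng.uniform(0.2, 0.8, size=15)
wvp = 2.0 * mmc + rng.normal(0, 0.1, size=15)  # strongly related, as the study reports

r, p = stats.pearsonr(mmc, wvp)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```

A correlation close to 1 with a small p-value would support the study's claim that both properties are governed by the same fiber wetting/wicking behavior.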

Keywords: 3D spacer fabrics, thermal conductivity, moisture management, work of compression (WC), resilience of compression (RC)

Procedia PDF Downloads 531
15515 A Non-parametric Clustering Approach for Multivariate Geostatistical Data

Authors: Francky Fouedjio

Abstract:

Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of these is the grouping of data locations into spatially contiguous clusters, such that data locations within the same cluster are more similar to each other, in some sense, than to those in other clusters. Spatially contiguous clusters can significantly improve interpretation by turning the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of the data. It integrates existing methods to find the optimal number of clusters and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected, and meaningful clusters is assessed using a bivariate synthetic dataset and a multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared to other similar geostatistical clustering methods.
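A simplified sketch of spatially constrained agglomerative clustering is shown below. It substitutes a k-nearest-neighbour connectivity graph for the paper's non-parametric kernel estimator of spatial dependence, and the dataset is synthetic, but it illustrates how spatial contiguity can be enforced during the merges:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(4)
# Hypothetical stand-in: 200 data locations with 2 attributes each, where the
# attribute pattern is tied to location (two spatial halves of the domain)
coords = rng.uniform(0, 10, size=(200, 2))
attrs = np.where(coords[:, :1] < 5,
                 rng.normal(0, 1, (200, 2)),
                 rng.normal(3, 1, (200, 2)))

# Spatial contiguity via a k-nearest-neighbour connectivity graph: merges are
# only allowed between spatially adjacent observations
conn = kneighbors_graph(coords, n_neighbors=8, include_self=False)
labels = AgglomerativeClustering(n_clusters=2, connectivity=conn,
                                 linkage="ward").fit_predict(attrs)
print(np.bincount(labels))
```

Because the connectivity graph restricts which observations may merge, each resulting cluster is spatially connected, which is exactly the property the paper's kernel-based dissimilarity is designed to encourage.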

Keywords: clustering, geostatistics, multivariate data, non-parametric

Procedia PDF Downloads 471
15514 The Significance of a Well-Defined Systematic Approach in Risk Management for Construction Projects within Oil Industry

Authors: Batool Ismaeel, Umair Farooq, Saad Mushtaq

Abstract:

Construction projects in the oil industry can be very complex, with unknown outcomes and uncertainties that cannot be easily predicted. Each project has its unique risks, generated by a number of factors which, if not controlled, will impact the successful completion of the project, mainly in terms of schedule, cost, quality, and safety. This paper highlights the historic risks associated with projects in the south and east region of Kuwait Oil Company (KOC), collated from the company's lessons-learned database. From contract award through to handover of the project to the asset owner, the gaps in project execution in terms of managing risk are brought to discussion; these are reflected in many claims, changes of scope, budget overruns, and delays in the engineering phase as well as in the procurement and fabrication of long-lead items, and they call for a well-defined systematic approach to project risk management. This study focuses on a proposed feasible approach to risk management for engineering, procurement, and construction (EPC) level projects, including the various stakeholders involved in executing the works, from international to local contractors and vendors in KOC. The proposed approach covers areas categorized into organizational, design, procurement, construction, pre-commissioning, commissioning, and project management, in which the risks are identified and require management and mitigation. With the effective deployment and implementation of the proposed risk management system, and its consideration as a vital key to achieving the project's targets, outcomes will be more predictable, risk triggers will be managed and controlled, and the correct resources can be allocated on a timely basis, helping the company avoid unpredictable outcomes during project execution.
It is recommended in this paper to apply this risk management approach as an integral part of project management, and to investigate further, in the future, the effectiveness of the proposed system on newly awarded projects, comparing them with projects of similar budget/complexity that have not applied this approach to risk management.

Keywords: construction, project completion, risk management, uncertainties

Procedia PDF Downloads 143
15513 Prevalence of Human Papillomavirus in Squamous Intraepithelial Lesions and Cervical Cancer in Women of the North of Chihuahua, Mexico

Authors: Estefania Ponce-Amaya, Ana Lidia Arellano-Ortiz, Cecilia Diaz-Hernandez, Jose Alberto Lopez-Diaz, Antonio De La Mora-Covarrubias, Claudia Lucia Vargas-Requena, Mauricio Salcedo-Vargas, Florinda Jimenez-Vega

Abstract:

Cervical cancer (CC) is the second leading cause of cancer death among women worldwide and has been associated with persistent infection by human papillomavirus (HPV). The goal of the current study was to identify the prevalence of HPV infection in women with abnormal Pap smears who were attended at the Dysplasia Clinic of Ciudad Juarez, Mexico. Methods: Cervical samples from 146 patients who attended the Colposcopy Clinic at Sanitary Jurisdiction II of Cd. Juarez were collected for histopathological and molecular study. DNA was isolated for HPV detection by polymerase chain reaction (PCR) using MY09/11 and GP5/6 primers. The associated risk factors were assessed by questionnaire. Statistical analysis was performed by ANOVA using EpiInfo 7 software. Results: HPV infection was present in 142 patients (97.3%). The prevalence of HPV infection was about 96% in all evaluated groups: low-grade squamous intraepithelial lesion (LSIL), high-grade squamous intraepithelial lesion (HSIL), and CC. We found statistical significance (p < 0.05) for gestation and number of births as risk factors. The median values showed an ascending trend with lesion progression; CC, however, showed a statistically significant difference with respect to the pre-carcinogenic stages. Conclusions: There is a high prevalence of HPV infection in these Mexican patients, and for that reason we are studying the most prevalent HPV genotypes in this population.
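From the reported counts, the prevalence and an approximate confidence interval can be reproduced as follows. A Wald interval is used here for simplicity; a Wilson interval would behave better this close to 100%:

```python
import math

# Reported counts: 142 of 146 patients HPV-positive
n, infected = 146, 142
p = infected / n
se = math.sqrt(p * (1 - p) / n)               # standard error of a proportion
lo, hi = p - 1.96 * se, min(p + 1.96 * se, 1.0)
print(f"HPV prevalence: {100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f}%)")
```

This reproduces the 97.3% figure quoted in the abstract and makes its sampling uncertainty explicit.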

Keywords: cervical cancer, HPV, prevalence hpv, squamous intraepithelial lesion

Procedia PDF Downloads 307
15512 Supporting Women's Economic Development in Rural Papua New Guinea

Authors: Katja Mikhailovich, Barbara Pamphilon

Abstract:

Farmer training in Papua New Guinea has focused mainly on technology-transfer approaches. These have primarily benefited men and often excluded women, whose low literacy and education levels and role in subsistence crops have precluded participation in formal training. The paper discusses an approach that uses both a brokerage model of agricultural extension, to link smallholders with private sector agencies, and an innovative family teams approach that aims to support the economic empowerment of women in families and encourages sustainable and gender-equitable farming and business practices.

Keywords: women, economic development, agriculture, training

Procedia PDF Downloads 381
15511 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling

Authors: Vibha Devi, Shabina Khanam

Abstract:

Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 α-linolenic essential fatty acids in a ratio of 3:1, a rare and highly desired ratio that enhances the quality of hemp oil. These components are beneficial for cell and body growth, strengthen the immune system, have anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting properties, and are a remedy for arthritis and various disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various conditions: temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm, and amount of co-solvent (0-10) % of the solvent flow rate, through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets by resampling from the original dataset and analyze these data to check the validity of the obtained results. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement; here the resample size is therefore 31 (eliminating one observation), and the procedure is repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample; here the resample size is 32, repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, variation coefficient, and standard error of the mean.
For the ω-6 linoleic acid concentration, the mean value was approximately 58.5% for both resampling methods, which is the average (central value) of the sample means of all data points. Similarly, for the ω-3 α-linolenic acid concentration, a mean of 22.5% was observed through both resampling methods. The variance exhibits the spread of the data around its mean; a greater variance indicates a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 α-linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approximately 1%), low standard error of the mean (< 0.8), and low variation coefficient (< 0.2) reflect the accuracy of the sample for prediction. All estimator values of the variation coefficient, standard deviation, and standard error of the mean are found within the 95% confidence interval.
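The two resampling schemes described above can be sketched as follows. The 32 "measurements" are synthetic stand-ins drawn around the reported mean, and the jackknife standard-error formula is the standard leave-one-out one:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical stand-in for the 32 measured ω-6 linoleic acid concentrations (%)
sample = rng.normal(58.5, 1.0, size=32)
n = len(sample)

# Jackknife: leave one observation out (resample size n-1), 32 replicates
jack = np.array([np.delete(sample, i).mean() for i in range(n)])
jack_se = np.sqrt((n - 1) / n * ((jack - jack.mean()) ** 2).sum())

# Bootstrap: resample with replacement (resample size n), 100 replicates as in the study
boot = np.array([rng.choice(sample, size=n, replace=True).mean()
                 for _ in range(100)])

print(f"mean {sample.mean():.2f}, jackknife SE {jack_se:.3f}, "
      f"bootstrap SE {boot.std(ddof=1):.3f}")
```

For the sample mean, the jackknife standard error coincides with the classic s/sqrt(n), and the bootstrap estimate should agree closely; a large discrepancy between the two would flag problems with the data or the estimator.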

Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation

Procedia PDF Downloads 131
15510 Multiscale Modeling of Damage in Textile Composites

Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese

Abstract:

Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in aerospace, automotive, and maritime industry. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the mesoscale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus, their interaction, and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. 
For the mesoscale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method, and the other making use of an embedded semi-analytical approach. The goal is the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.
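As a small illustration of the "effective material" idea used at the mesoscale, a rule-of-mixtures estimate of homogenized tow stiffness might look like the sketch below; the fibre/matrix values are generic carbon/epoxy assumptions, not data from the paper:

```python
# Mesoscale models treat each tow as a homogenized, transversely isotropic
# material. Simplest estimates of its elastic constants from constituent data:
Ef, Em = 230.0, 3.5    # fibre and matrix Young's moduli [GPa] (assumed values)
Vf = 0.6               # fibre volume fraction within the tow (assumed)

E1 = Vf * Ef + (1 - Vf) * Em          # longitudinal: Voigt (rule of mixtures)
E2 = 1.0 / (Vf / Ef + (1 - Vf) / Em)  # transverse: Reuss (inverse rule of mixtures)
print(f"E1 = {E1:.1f} GPa, E2 = {E2:.1f} GPa")
```

The order-of-magnitude gap between E1 and E2 is precisely the anisotropy that the tow-level material models in the mesoscale approach must represent.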

Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites

Procedia PDF Downloads 339