Search results for: forensic autopsy data
22538 Computational Fluid Dynamics Simulations and Analysis of Air Bubble Rising in a Column of Liquid
Authors: Baha-Aldeen S. Algmati, Ahmed R. Ballil
Abstract:
Multiphase flows occur widely in many engineering and industrial processes as well as in the environment we live in. In particular, bubbly flows are considered to be crucial phenomena in fluid flow applications and can be studied and analyzed experimentally, analytically, and computationally. In the present paper, the dynamic motion of an air bubble rising within a column of liquid is numerically simulated using the open-source CFD modeling tool OpenFOAM. An interface-tracking numerical algorithm called the MULES algorithm, which is built into OpenFOAM, is chosen to solve an appropriate mathematical model based on the volume of fluid (VOF) numerical method. The bubbles initially have a spherical shape and start from rest in the stagnant column of liquid. The algorithm is initially verified against numerical results and is also validated against available experimental data. The comparison revealed that this algorithm provides results that are in very good agreement with the 2D numerical data of other CFD codes. Also, the bubble shape and terminal velocity obtained from the 3D numerical simulation showed very good qualitative and quantitative agreement with the experimental data. The simulated rising bubbles yield a very small percentage error in bubble terminal velocity compared with the experimental data. The obtained results prove the capability of OpenFOAM as a powerful tool to predict the rising behavior of spherical bubbles in a stagnant column of liquid. This will pave the way for a deeper understanding of the phenomenon of the rise of bubbles in liquids.
Keywords: CFD simulations, multiphase flows, OpenFOAM, rise of bubble, volume of fluid method, VOF
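For context, the VOF model that MULES advects can be summarized by the standard volume-fraction transport equation (a textbook form, not reproduced from the paper itself):

```latex
\frac{\partial \alpha}{\partial t} + \nabla \cdot \left( \alpha \mathbf{U} \right) = 0,
\qquad
\rho = \alpha \, \rho_{l} + (1 - \alpha) \, \rho_{g}
```

where α is the liquid volume fraction and U the shared velocity field; OpenFOAM's interFoam-family solvers additionally apply an interface-compression flux, limited by MULES, to keep the gas-liquid interface sharp.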
Procedia PDF Downloads 123

22537 Estimating Groundwater Seepage Rates: Case Study at Zegveld, Netherlands
Authors: Wondmyibza Tsegaye Bayou, Johannes C. Nonner, Joost Heijkers
Abstract:
This study aimed to identify and estimate dynamic groundwater seepage rates using four comparative methods: the Darcian approach, the water balance approach, the tracer method, and modeling. The theoretical background to these methods is put together in this study. The methodology was applied to a case study area at Zegveld following the advice of the Water Board Stichtse Rijnlanden. Data were collected from various offices and through a field campaign in the winter of 2008/09. In the complex confining layer of the study area, the phreatic groundwater table lies at a shallow depth compared to the piezometric water level. Data were available for the model years 1989 to 2000 and the winter of 2008/09. The higher groundwater table produces predominantly downward seepage in the study area. Results of the study indicated that net recharge to the groundwater table (precipitation excess) and the ditch system are the principal sources of seepage across the complex confining layer; especially in the summer season, the contribution from the ditches is significant. Water is supplied from the River Meije through a pumping system to meet the ditches' water demand. The groundwater seepage rate was distributed unevenly throughout the study area, averaging 0.60 mm/day at the nature reserve for the model years 1989 to 2000 and 0.70 mm/day for the winter of 2008/09. Due to data restrictions, the seepage rates were mainly determined with the Darcian method; the water balance approach and the tracer method were applied to compute the flow exchange within the ditch system. The site had various validated sources of groundwater levels and vertical flow resistance data. The phreatic groundwater level map, compared with the TNO-DINO groundwater level data, overestimated the groundwater level depth by 28 cm. The hydraulic resistance values obtained from the 3D geological map, compared with the TNO-DINO data, agreed with the model values before calibration.
On the other hand, the calibrated model significantly underestimated the downward seepage in the area compared with the field-based computations following the Darcian approach.
Keywords: groundwater seepage, phreatic water table, piezometric water level, nature reserve, Zegveld, The Netherlands
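The Darcian approach referred to above divides the head difference across the confining layer by its vertical flow resistance. A minimal sketch, with illustrative numbers that are not taken from the study:

```python
def seepage_rate_mm_per_day(head_difference_m, hydraulic_resistance_days):
    """Darcian seepage across a confining layer: q = dh / c.

    head_difference_m: phreatic level minus piezometric level (m);
        positive values mean downward seepage.
    hydraulic_resistance_days: vertical flow resistance c (days).
    Returns the seepage rate in mm/day.
    """
    q_m_per_day = head_difference_m / hydraulic_resistance_days
    return q_m_per_day * 1000.0  # m/day -> mm/day

# Hypothetical values: a 0.7 m head difference over c = 1000 days
# gives 0.7 mm/day downward seepage, the order of magnitude reported.
print(seepage_rate_mm_per_day(0.7, 1000.0))
```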
Procedia PDF Downloads 85

22536 Incorporating Spatial Transcriptome Data into Ligand-Receptor Analyses to Discover Regional Activation in Cells
Authors: Eric Bang
Abstract:
Interactions between receptors and ligands are crucial for many essential biological processes, including neurotransmission and metabolism. Ligand-receptor analyses that examine cell behavior and interactions often utilize cell type-specific RNA expression from single-cell RNA sequencing (scRNA-seq) data. Using CellPhoneDB, a public repository of ligands, receptors, and ligand-receptor interactions, cell-cell interactions were explored in a specific scRNA-seq dataset from kidney tissue, and the results were portrayed with dot plots and heat maps. Depending on the type of cell, each ligand-receptor pair was aligned with the interacting cell type, and the probabilities of these associations were calculated, with corresponding P values reflecting average expression values between the triads and their significance. Using single-cell data (sample kidney cell references), genes in the dataset were cross-referenced with those in the existing CellPhoneDB dataset. For example, a gene such as Pleiotrophin (PTN) present in the single-cell data also needed to be present in the CellPhoneDB dataset. Using the single-cell transcriptomics data via Slide-seq and reference data, the CellPhoneDB program defines cell types and plots them in different formats, the two main ones being dot plots and heat map plots. The dot plot displays derived measures of the cell-cell interaction scores and p values: each row shows a ligand-receptor pair, and each column shows the two interacting cell types. CellPhoneDB defines interactions and interaction levels from the gene expression level; since the p-value is on a -log10 scale, the larger dots represent more significant interactions. By performing an interaction analysis, a significant interaction was discovered for myeloid and T-cell ligand-receptor pairs, including those between Secreted Phosphoprotein 1 (SPP1) and Fibronectin 1 (FN1), which is consistent with previous findings.
It was proposed that an effective protocol would involve a filtration step in which cell types are filtered out depending on which ligand-receptor pair is activated in that part of the tissue, as well as the incorporation of the CellPhoneDB data into a streamlined workflow pipeline. The filtration step would take the form of a Python script that expedites the manual process necessary for dataset filtration; being in Python allows it to be integrated with the CellPhoneDB dataset for future workflow analysis. The manual process involves filtering cell types based on which ligand/receptor pair is activated in kidney cells. One limitation is that some pairings are activated in multiple cells at a time, so the data must be manually adjusted prior to analysis. Using the filtration script, accurate sorting is incorporated into the CellPhoneDB database rather than waiting until the output is produced and then subsequently applying spatial data. It was envisioned that this would reveal where various ligands and receptors are interacting with different cell types, allowing for easier identification of which cells are being impacted and why, for the purpose of disease treatment. The hope is that this new computational method utilizing spatially explicit ligand-receptor association data can be used to uncover previously unknown specific interactions within kidney tissue.
Keywords: bioinformatics, ligands, kidney tissue, receptors, spatial transcriptome
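The filtration step described above can be sketched as a small Python function. The record schema here (keys `pair`, `cell_a`, `cell_b`, `p`) is a simplified assumption for illustration, not the actual CellPhoneDB output format:

```python
def filter_pairs(interactions, cell_type, p_max=0.05):
    """Keep ligand-receptor interactions that involve `cell_type`
    and are significant at p <= p_max."""
    return [
        row for row in interactions
        if cell_type in (row["cell_a"], row["cell_b"]) and row["p"] <= p_max
    ]

# Toy records using gene pairs mentioned in the abstract:
records = [
    {"pair": "SPP1_FN1", "cell_a": "myeloid", "cell_b": "T cell", "p": 0.001},
    {"pair": "PTN_X", "cell_a": "podocyte", "cell_b": "tubule", "p": 0.30},
]
print(filter_pairs(records, "myeloid"))  # keeps only the SPP1_FN1 row
```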
Procedia PDF Downloads 139

22535 Geographic Information System (GIS) for Structural Typology of Buildings
Authors: Néstor Iván Rojas, Wilson Medina Sierra
Abstract:
Managing spatial information is described through a Geographic Information System (GIS) for some neighborhoods in the city of Tunja, in relation to the structural typology of the buildings. The use of GIS provides tools that facilitate the capture, processing, analysis and dissemination of cartographic information and the quality evaluation of the building classification, and it allows the development of a method that unifies and standardizes information processes. The project aims to generate a geographic database that is useful to the entities responsible for planning, disaster prevention and the care of vulnerable populations; it also seeks to be a basis for seismic vulnerability studies that can contribute to a study of urban seismic microzonation. The methodology consists in capturing the plat, including road names, neighborhoods, blocks and buildings, to which were added as attributes the data from the evaluation of each building, such as the number of inhabitants and classification, year of construction, predominant structural system, type of mezzanine board and its condition, presence of geotechnical problems, type of cover, use of each building, and damage to structural and non-structural elements. The above data are tabulated in a spreadsheet that includes the cadastral number, through which they are systematically linked to the respective building, which also carries that attribute. A geo-referenced database is obtained, from which graphical outputs are generated, producing thematic maps for each evaluated attribute, which clearly show the spatial distribution of the information obtained. Using GIS offers important advantages for spatial information management and facilitates consultation and updating. The usefulness of the project is recognized as a basis for studies on planning and prevention issues.
Keywords: microzonation, buildings, geo-processing, cadastral number
Procedia PDF Downloads 334

22534 Improving Cell Type Identification of Single Cell Data by Iterative Graph-Based Noise Filtering
Authors: Annika Stechemesser, Rachel Pounds, Emma Lucas, Chris Dawson, Julia Lipecki, Pavle Vrljicak, Jan Brosens, Sean Kehoe, Jason Yap, Lawrence Young, Sascha Ott
Abstract:
Advances in technology now make it possible to retrieve the genetic information of thousands of single cancerous cells. One of the key challenges in single cell analysis of cancerous tissue is to determine the number of different cell types and their characteristic genes within the sample, to better understand the tumors and their reaction to different treatments. For this analysis to be possible, it is crucial to filter out background noise, as it can severely blur the downstream analysis and give misleading results. In-depth analysis of the state-of-the-art filtering methods for single cell data showed that they, in some cases, do not separate noisy cells from normal cells sufficiently. We introduce an algorithm that filters and clusters single cell data simultaneously without relying on particular genes or thresholds chosen by eye. It detects communities in a Shared Nearest Neighbor similarity network, which captures the similarities and dissimilarities of the cells, by optimizing the modularity, and then identifies and removes vertices that belong only weakly to their cluster. This strategy is based on the fact that noisy data instances are very likely to be similar to true cell types but do not match any of them well. Once the clustering is complete, we apply a set of evaluation metrics at the cluster level and accept or reject clusters based on the outcome. The performance of our algorithm was tested on three datasets and led to convincing results. We were able to replicate the results on a Peripheral Blood Mononuclear Cells dataset. Furthermore, we applied the algorithm to two samples of ovarian cancer from the same patient before and after chemotherapy. Comparing the standard approach to our algorithm, we found a hidden cell type in the ovarian post-chemotherapy data with interesting marker genes that are potentially relevant for medical research.
Keywords: cancer research, graph theory, machine learning, single cell analysis
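The core idea of a Shared Nearest Neighbor graph with weakly attached vertices removed can be sketched in a few lines. This is a heavily simplified stand-in for the paper's method (no modularity optimization; a vertex is dropped when it has no mutual-kNN edge carrying shared neighbors):

```python
import math

def knn(points, k):
    """Indices of the k nearest neighbors (Euclidean) of each point."""
    nbrs = []
    for i, p in enumerate(points):
        dists = sorted(
            (math.dist(p, q), j) for j, q in enumerate(points) if j != i
        )
        nbrs.append({j for _, j in dists[:k]})
    return nbrs

def filter_weak_vertices(points, k=3):
    """Drop vertices whose mutual-kNN edges share no neighbors."""
    nbrs = knn(points, k)
    keep = []
    for i in range(len(points)):
        strength = sum(
            len(nbrs[i] & nbrs[j])
            for j in nbrs[i] if i in nbrs[j]  # mutual neighbors only
        )
        if strength > 0:
            keep.append(i)
    return keep
```

With two tight clusters and one point halfway between them, the in-between point is never a mutual nearest neighbor of anyone, so it is filtered out while all cluster members survive.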
Procedia PDF Downloads 112

22533 Data Collection Techniques for Robotics to Identify the Facial Expressions of Traumatic Brain Injured Patients
Authors: Chaudhary Muhammad Aqdus Ilyas, Matthias Rehm, Kamal Nasrollahi, Thomas B. Moeslund
Abstract:
This paper presents an investigation of data collection procedures associated with robots placed with traumatic brain injured (TBI) patients for rehabilitation purposes through facial expression and mood analysis. Rehabilitation after TBI is crucial due to the nature of the injury and variation in recovery time. It is advantageous to analyze these emotional signals in a contactless manner, due to the non-supportive behavior of patients, limited muscle movements, and an increase in negative emotional expressions. This work aims at the development of a framework in which robots can recognize TBI patients' emotions through facial expressions in order to perform rehabilitation tasks via physical, cognitive or interactive activities. The results of these studies show that, with customized data collection strategies, the proposed framework identifies facial and emotional expressions more accurately, which can be utilized to enhance recovery treatment and social interaction in a robotic context.
Keywords: computer vision, convolutional neural network-long short-term memory (CNN-LSTM), facial expression and mood recognition, multimodal (RGB-thermal) analysis, rehabilitation, robots, traumatic brain injured patients
Procedia PDF Downloads 155

22532 The Extended Skew Gaussian Process for Regression
Authors: M. T. Alodat
Abstract:
In this paper, we propose a generalization of the Gaussian process regression (GPR) model called the extended skew Gaussian process for regression (ESGPr) model. The ESGPr model works better than the GPR model when the errors are skewed. We derive the predictive distribution for the ESGPr model at a new input. We also apply the ESGPr model to FOREX data and find that it fits the data better than the GPR model.
Keywords: extended skew normal distribution, Gaussian process for regression, predictive distribution, ESGPr model
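For reference, the standard GPR predictive distribution that the ESGPr model generalizes is Gaussian (the ESGPr replaces the Gaussian error assumption with an extended skew normal one; the skewed form itself is the paper's contribution and is not reproduced here):

```latex
f_{*} \mid X, \mathbf{y}, x_{*} \sim
\mathcal{N}\!\left(
  \mathbf{k}_{*}^{\top} \left( K + \sigma_{n}^{2} I \right)^{-1} \mathbf{y},\;
  k(x_{*}, x_{*}) - \mathbf{k}_{*}^{\top} \left( K + \sigma_{n}^{2} I \right)^{-1} \mathbf{k}_{*}
\right)
```

where K is the kernel matrix over the training inputs, k* the vector of covariances between the new input x* and the training inputs, and σₙ² the noise variance.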
Procedia PDF Downloads 553

22531 Training AI to Be Empathetic and Determining the Psychotype of a Person During a Conversation with a Chatbot
Authors: Aliya Grig, Konstantin Sokolov, Igor Shatalin
Abstract:
The report describes the methodology for collecting data and building an ML model that determines a personality psychotype, using profiling and personality-traits methods, from several short messages of a user communicating on an arbitrary topic with a chitchat bot. In the course of the experiments, the minimum amount of text needed to confidently determine aspects of personality was identified; model accuracy is 85%. The users' language of communication is English. The goal is AI for personalized communication with a user based on their mood, personality, and current emotional state. Features investigated during the research: personalized communication; providing empathy; adaptation to a user; predictive analytics. In the report, we describe the processes that capture both structured and unstructured data pertaining to a user in large quantities and diverse forms. This data is then processed through ML tools to construct a knowledge graph and draw inferences about users of text messages in a comprehensive manner. Specifically, the system analyzes users' behavioral patterns and predicts future scenarios based on this analysis. As a result of the experiments, we provide directions for further research on training AI models to be empathetic and creating personalized communication for a user.
Keywords: AI, empathetic, chatbot, AI models
Procedia PDF Downloads 92

22530 Anemia Among Pregnant Women in Kuwait: Findings from Kuwait Birth Cohort Study
Authors: Majeda Hammoud
Abstract:
Background: Anemia during pregnancy increases the risk of delivery by cesarean section, low birth weight, preterm birth, perinatal mortality, stillbirth, and maternal mortality. In this study, we aimed to assess the prevalence of anemia in pregnant women and its associated factors in the Kuwait birth cohort study. Methods: The Kuwait birth cohort (N=1108) was a prospective cohort study in which pregnant women were recruited in the third trimester. Data were collected through personal interviews with mothers who attended antenatal care visits, including data on socio-economic status and lifestyle factors. Blood samples were taken after recruitment to measure multiple laboratory indicators. Clinical data, including data on comorbidities, were extracted from the medical records by a clinician. Anemia was defined as hemoglobin (Hb) <110 g/L, further classified as mild (100-109 g/L), moderate (70-99 g/L), or severe (<70 g/L). Predictors of anemia were classified as underlying or direct factors, and logistic regression was used to investigate their association with anemia. Results: The mean Hb level in the study group was 115.21 g/L (95%CI: 114.56-115.87 g/L), with significant differences between age groups (p=0.034). The prevalence of anemia was 28.16% (95%CI: 25.53-30.91%), with no significant difference by age group (p=0.164). Of all 1108 pregnant women, 8.75% had moderate anemia and 19.40% had mild anemia, but none had severe anemia. In multivariable analysis, getting pregnant while using contraception, adjusted odds ratio (AOR) 1.73 (95%CI: 1.01-2.96), p=0.046, and current use of supplements, AOR 0.50 (95%CI: 0.26-0.95), p=0.035, were significantly associated with anemia (underlying factors). Of the direct factors, only iron and ferritin levels were significantly associated with anemia (p<0.001).
Conclusion: Although severe anemia is rare among pregnant women in Kuwait, mild and moderate anemia remain a significant health problem despite free access to antenatal care.
Keywords: anemia, pregnancy, hemoglobin, ferritin
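The severity cut-offs defined in the Methods section translate directly into code. A minimal sketch using the study's own thresholds:

```python
def classify_anemia(hb_g_per_l):
    """Classify anemia from hemoglobin (g/L) using the study's cut-offs:
    anemia if Hb < 110; mild 100-109, moderate 70-99, severe < 70."""
    if hb_g_per_l >= 110:
        return "none"
    if hb_g_per_l >= 100:
        return "mild"
    if hb_g_per_l >= 70:
        return "moderate"
    return "severe"

print(classify_anemia(105))  # mild
```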
Procedia PDF Downloads 50

22529 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison
Authors: Xiangtuo Chen, Paul-Henry Cournéde
Abstract:
Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict corn yield from meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their modeling methodologies. The model-driven approaches are based on mechanistic crop modeling; they describe crop growth in interaction with the environment as dynamical systems. But the calibration of such a dynamical system is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical processes but places strict requirements on the dataset. A second contribution of the paper is the comparison of the model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso regression, principal components regression and partial least squares regression) and machine learning methods (random forest, k-nearest neighbor, artificial neural network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate crop prediction capacity.
The results show that among the data-driven approaches, random forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest
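The evaluation protocol above (5-fold cross-validation reporting RMSEP and MAEP) can be sketched with a toy hand-rolled k-nearest-neighbor regressor standing in for the paper's models; data and model choice here are illustrative only:

```python
import math

def knn_predict(train, x, k=2):
    """Mean target of the k nearest training points (1-D features)."""
    nearest = sorted(train, key=lambda xy: abs(xy[0] - x))[:k]
    return sum(y for _, y in nearest) / k

def cross_validate(data, folds=5, k=2):
    """Return (RMSEP, MAEP) over `folds` contiguous folds."""
    n = len(data)
    errors = []
    for f in range(folds):
        test = data[f * n // folds:(f + 1) * n // folds]
        train = [xy for xy in data if xy not in test]
        errors += [knn_predict(train, x, k) - y for x, y in test]
    rmsep = math.sqrt(sum(e * e for e in errors) / len(errors))
    maep = sum(abs(e) for e in errors) / len(errors)
    return rmsep, maep
```

On a synthetic linear dataset, RMSEP is never smaller than MAEP, which is one quick sanity check on an implementation of these two metrics.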
Procedia PDF Downloads 231

22528 Academic Goal Setting Practices of University Students in Lagos State, Nigeria: Implications for Counselling
Authors: Asikhia Olubusayo Aduke
Abstract:
Students’ inability to set data-based (specific, measurable, attainable, relevant, and time-bound) personal improvement goals threatens their academic success. Hence, the study aimed to investigate year-one students’ academic goal-setting practices at Lagos State University of Education, Nigeria. Descriptive survey research was used in carrying out this study. The study population consisted of 3,101 year-one students of the university. A sample of five hundred and one (501) participants was selected through a proportional and simple random sampling technique. The Formative Goal Setting Questionnaire (FGSQ) developed by Research Collaboration (2015) was adapted and used as the instrument for the study. Two main research questions were answered, and two null hypotheses were formulated and tested. The study revealed higher data-based goals than personal improvement goals for all students; nevertheless, both data-based and personal improvement goal-setting were higher for female students than for male students. A one-sample test statistic and ANOVA used to analyse the data for the two hypotheses also revealed that the mean difference between male and female year-one students’ data-based and personal improvement goal formation was statistically significant (p < 0.05). This means year-one students’ data-based and personal improvement goals showed significant gender differences. Based on the findings of this study, it was recommended, among others, that therapeutic techniques that can help to change students’ faulty thinking and challenge their lack of desire for personal improvement should be sought for students who have problems setting high personal improvement goals. Counsellors also need to advocate continued research into how to increase the goal-setting ability of male students and should focus more on counselling male students on goal-setting.
The main contribution of the study is that higher institutions must prioritize early intervention in first-year students' academic goal setting. Researching gender differences in this practice reveals a crucial insight: male students often lag behind in setting meaningful goals, which impacts their motivation and performance. Focusing on this demographic with data-driven personal improvement goals can be transformative. By promoting goal setting that is specific, measurable, and focused on self-growth (rather than competition), male students can unlock their full potential. Researchers and counsellors play a vital role in detecting and supporting students with lower goal-setting tendencies. By prioritizing this intervention, we can empower all students to set ambitious, personalized goals that ignite their passion for learning and pave the way for academic success.
Keywords: academic goal setting, counselling, practice, university, year one students
Procedia PDF Downloads 62

22527 Estimating X-Ray Spectra for Digital Mammography by Using the Expectation Maximization Algorithm: A Monte Carlo Simulation Study
Authors: Chieh-Chun Chang, Cheng-Ting Shih, Yan-Lin Liu, Shu-Jun Chang, Jay Wu
Abstract:
With the widespread use of digital mammography (DM), radiation dose evaluation of the breast has become important. X-ray spectra are one of the key factors that influence the absorbed dose of glandular tissue. In this study, we estimated the X-ray spectrum of DM using the expectation maximization (EM) algorithm with transmission measurement data. The interpolating polynomial model proposed by Boone was applied to generate the initial guess of the DM spectrum for the target/filter combination of Mo/Mo and a tube voltage of 26 kVp. The Monte Carlo N-Particle code (MCNP5) was used to tally the transmission data through aluminum sheets of 0.2 to 3 mm. The X-ray spectrum was then reconstructed iteratively with the EM algorithm, and the influence of the initial guess on the EM reconstruction was evaluated. The percentage error of the average energy between the reference spectrum input to the Monte Carlo simulation and the spectrum estimated by the EM algorithm was -0.14%. The normalized root mean square error (NRMSE) and the normalized root max square error (NRMaSE) between the two spectra were 0.6% and 2.3%, respectively. We conclude that the EM algorithm with transmission measurement data is a convenient and useful tool for estimating X-ray spectra for DM in clinical practice.
Keywords: digital mammography, expectation maximization algorithm, X-ray spectrum, X-ray
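The iterative EM reconstruction from transmission data can be illustrated with a toy two-bin spectrum and a multiplicative MLEM-style update. The attenuation coefficients, thicknesses, and iteration count below are illustrative assumptions, not the study's values:

```python
import math

def forward(spectrum, mus, thicknesses):
    """Predicted transmission through each attenuator thickness."""
    return [
        sum(s * math.exp(-mu * t) for s, mu in zip(spectrum, mus))
        for t in thicknesses
    ]

def em_reconstruct(measured, mus, thicknesses, iters=5000):
    """Multiplicative EM update of the binned spectrum weights."""
    s = [1.0 / len(mus)] * len(mus)  # flat initial guess
    a = [[math.exp(-mu * t) for mu in mus] for t in thicknesses]
    for _ in range(iters):
        est = forward(s, mus, thicknesses)
        s = [
            s[j] * sum(a[i][j] * measured[i] / est[i] for i in range(len(a)))
                 / sum(a[i][j] for i in range(len(a)))
            for j in range(len(mus))
        ]
    return s
```

With noiseless synthetic data the reconstruction converges back toward the spectrum that generated the measurements, which is the basic consistency check behind the NRMSE figures quoted above.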
Procedia PDF Downloads 730

22526 Single Imputation for Audiograms
Authors: Sarah Beaver, Renee Bryce
Abstract:
Audiograms detect hearing impairment, but missing values pose problems. This work explores imputation in an attempt to improve accuracy. It implements Linear Regression, Lasso, Linear Support Vector Regression, Bayesian Ridge, K Nearest Neighbors (KNN), and Random Forest machine learning techniques to impute audiogram frequencies ranging from 125 Hz to 8000 Hz. The data contain patients who had, or were candidates for, cochlear implants. Accuracy is compared across two different nested cross-validation k values. Over 4000 audiograms from 800 unique patients were used. Additionally, training on combined left- and right-ear audiograms is compared with training on single-ear audiograms. The root mean square error (RMSE) values for the best Random Forest models range from 4.74 to 6.37, and their R² values range from .91 to .96. The RMSE values for the best KNN models range from 5.00 to 7.72, and their R² values range from .89 to .95. The best imputation models achieved R² between .89 and .96 and RMSE values of less than 8 dB. We also show that classification predictive models performed better with our best imputation models than with constant imputation, by a two percent increase in accuracy.
Keywords: machine learning, audiograms, data imputations, single imputations
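The simplest of the regression imputers listed above can be sketched with an ordinary-least-squares fit that predicts a missing frequency threshold from a neighboring frequency; the frequencies and values here are hypothetical, not drawn from the study's data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def impute_threshold(complete_rows, known_value):
    """Impute a missing frequency's threshold from a neighboring one.

    complete_rows: (known_freq_dB, missing_freq_dB) pairs taken from
    patients with both values present.
    """
    a, b = fit_line([r[0] for r in complete_rows],
                    [r[1] for r in complete_rows])
    return a + b * known_value
```

On perfectly linear data the imputation is exact, which makes a convenient unit test before moving to the stronger models (Random Forest, KNN) the abstract compares.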
Procedia PDF Downloads 82

22525 Role of Imaging in Alzheimer's Disease Trials: Impact on Trial Planning, Patient Recruitment and Retention
Authors: Kohkan Shamsi
Abstract:
Background: MRI and PET are now extensively utilized in Alzheimer's disease (AD) trials for patient eligibility, efficacy assessment, and safety evaluations, but including imaging in AD trials impacts the site selection process, patient recruitment, and patient retention. Methods: PET/MRI are performed at baseline and at multiple follow-up timepoints. This requires prospective site imaging qualification, evaluation of phantom data, training, and continuous monitoring of machines for the acquisition of standardized and consistent data. It also requires prospective patient/caregiver training, as patients must go to multiple facilities for imaging examinations. We will share our experience from one of the largest AD programs. Lessons learned: Many neurological diseases have a presentation similar to AD or could confound the assessment of drug therapy. The inclusion of ineligible patients has ethical and legal implications, and their data could be excluded from the analysis. The centralized eligibility evaluation read process will be discussed. Amyloid-related imaging abnormalities (ARIA) were observed in amyloid-β trials, and the FDA recommended regular monitoring of ARIA. Our experience with ARIA evaluations in a large phase III study at >350 sites will be presented. Efficacy evaluation: MRI is utilized to evaluate various brain volumes, and FDG PET or amyloid PET agents have been used in AD trials. We will share our experience with site reads and central independent reads. Imaging logistics issues that need to be handled in the planning phase will also be discussed, as they can impact patient compliance, thereby increasing missing data and affecting study results. Conclusion: Imaging must be prospectively planned, including standardizing imaging methodologies, the site selection process, and the assessment criteria. Training should be transparently conducted and documented.
Prospective patient/caregiver awareness of imaging requirements is essential for patient compliance and for reducing missing imaging data.
Keywords: Alzheimer's disease, ARIA, MRI, PET, patient recruitment, retention
Procedia PDF Downloads 115

22524 The Thoughts and Feelings of 60-72 Month Old Children about School and Teacher
Authors: Ayse Ozturk Samur, Gozde Inal Kiziltepe
Abstract:
No matter the level of education, starting school is an exciting process, as it includes new experiences. In this process, the child steps into an environment and institution other than the family into which he was born and feels secure. The new environment is different from home: it is a social environment that has its own rules and involves duties and responsibilities that should be fulfilled, as well as new vital experiences. Children who have a positive attitude towards school and like school are more enthusiastic and eager to participate in classroom activities. Moreover, a close relationship with the teacher enables the child to have positive emotions and ideas about the teacher and school and helps children adapt to school easily. This study aims to identify children's perceptions of academic competence, attitudes towards school, and ideas about their teachers. In accordance with this aim, a mixed method including both qualitative and quantitative data collection was used, with the study supported by qualitative data collected after the quantitative data. The study group consists of 250 randomly chosen children, 60-72 months old, attending a preschool institution in a city center in the West Anatolian region of Turkey. Quantitative data were collected using the Feelings about School (FAS) scale, which consists of 12 items and 4 dimensions: school, teacher, mathematics, and literacy. The reliability and validity study for the scale was conducted by the researchers with 318 children who were 60-72 months old. For content validity, experts' opinions were sought; for construct validity, confirmatory factor analysis was utilized. Reliability was examined by calculating the internal consistency coefficient (Cronbach's alpha).
At the end of the analyses, it was found that the FAS is a valid and reliable instrument for identifying 60-72-month-old children's perceptions of their academic competency, attitudes toward school, and ideas about their teachers. For the qualitative dimension of the study, semi-structured interviews were conducted with 30 children aged 60-72 months. The study found that children's perceptions of their academic competencies and attitudes towards school were medium-level, while their ideas about their teachers were high. Based on the semi-structured interviews, the children have a positive perception of school and the teacher; that is, the quantitative data are supported by the qualitative data.
Keywords: feelings, preschool education, school, teacher, thoughts
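The internal consistency coefficient used above is a standard computation. A minimal sketch of Cronbach's alpha on toy data (not the study's scores):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item var) / var(totals)).

    item_scores: one list of respondent scores per scale item,
    all of equal length. Population variances used throughout.
    """
    k = len(item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    item_var = sum(pvariance(scores) for scores in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))
```

When all items give identical scores for every respondent, alpha equals 1.0 exactly, which is a quick correctness check.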
Procedia PDF Downloads 224
22523 Determination of Optimum Torque of an Internal Combustion Engine by Exergy Analysis
Authors: Veena Chaudhary, Rakesh P. Gakkhar
Abstract:
In this study, energy and exergy analyses are applied to experimental data from an internal combustion engine operating on the conventional diesel cycle. The experimental data were collected using an engine unit that enables accurate measurement of fuel flow rate, combustion air flow rate, engine load, engine speed, and all relevant temperatures. First- and second-law efficiencies are calculated for different engine speeds and compared. Results indicate that the first-law (energy) efficiency is maximum at 1700 rpm, whereas exergy efficiency is maximum, and exergy destruction minimum, at 1900 rpm. Keywords: diesel engine, exergy destruction, exergy efficiency, second law of thermodynamics
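The two efficiencies being compared can be sketched as follows. This is a generic textbook formulation, not the authors' calculation: the lower heating value (42,500 kJ/kg) and the chemical-exergy-to-LHV ratio phi of about 1.065 are common approximations for diesel fuel, used here only for illustration.

```python
def first_law_efficiency(power_kw, fuel_flow_kg_s, lhv_kj_kg=42500.0):
    """Brake thermal (first-law) efficiency: work out over fuel energy in."""
    return power_kw / (fuel_flow_kg_s * lhv_kj_kg)

def exergy_efficiency(power_kw, fuel_flow_kg_s, lhv_kj_kg=42500.0, phi=1.065):
    """Second-law efficiency; phi ~= 1.065 is a common approximation
    relating diesel chemical exergy to its lower heating value."""
    return power_kw / (fuel_flow_kg_s * phi * lhv_kj_kg)
```

Because fuel exergy exceeds the heating value, the second-law efficiency is always slightly below the first-law value for the same operating point, which is why the two can peak at different speeds.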
Procedia PDF Downloads 329
22522 Estimation of Atmospheric Parameters for Weather Study and Forecast over Equatorial Regions Using a Ground-Based Global Positioning System
Authors: Asmamaw Yehun, Tsegaye Kassa, Addisu Hunegnaw, Martin Vermeer
Abstract:
There are various ways to estimate neutral atmospheric parameter values, such as in-situ measurements and reanalysis datasets from numerical models. Accurate estimates of the atmospheric parameters are useful for weather forecasting, climate modeling, and monitoring of climate change. Recently, Global Navigation Satellite System (GNSS) measurements have been applied to atmospheric sounding due to their robust data quality and wide horizontal and vertical coverage. Global Positioning System (GPS) solutions that include tropospheric parameters constitute a reliable dataset for assimilation into climate models. The objective of this paper is to estimate the neutral atmospheric parameters Wet Zenith Delay (WZD), Precipitable Water Vapour (PWV), and Total Zenith Delay (TZD) using observational data from 2012 to 2015 at six selected GPS stations in the equatorial regions, more precisely the Ethiopian GPS stations. Based on the historic GPS-derived PWV estimates, we forecast PWV from 2015 to 2030. For data processing and analysis, we applied the GAMIT/GLOBK software packages to estimate the atmospheric parameters. We find that the annual averaged minimum PWV is 9.72 mm for IISC and the maximum is 50.37 mm for BJCO; the annual averaged minimum WZD is 6 cm for IISC and the maximum is 31 cm for BDMT. In the long observation series (2012 to 2015), we also find trends and cyclic patterns in WZD, PWV, and TZD for all stations. Keywords: atmosphere, GNSS, neutral atmosphere, precipitable water vapour
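The link between the wet zenith delay and PWV quoted above is the standard dimensionless conversion factor Pi. The sketch below uses typical refractivity constants and an assumed weighted mean temperature Tm of 270 K; these are generic literature values, not the values used in the paper.

```python
def zwd_to_pwv(zwd_mm, tm_kelvin=270.0):
    """Convert zenith wet delay (mm) to precipitable water vapour (mm).

    Pi = 1e6 / (rho_w * R_v * (k3/Tm + k2')), with the refractivity
    constants expressed in SI units; Pi is about 0.15 for typical Tm.
    """
    rho_w = 1000.0   # density of liquid water, kg/m^3
    r_v = 461.5      # specific gas constant of water vapour, J/(kg K)
    k2p = 0.221      # K/Pa
    k3 = 3739.0      # K^2/Pa
    pi_factor = 1.0e6 / (rho_w * r_v * (k3 / tm_kelvin + k2p))
    return pi_factor * zwd_mm
```

As a rough consistency check, a 31 cm WZD (the BDMT maximum) maps to roughly 48 mm of PWV under these assumptions, the same order as the ~50 mm maximum PWV reported for BJCO.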
Procedia PDF Downloads 61
22521 Design and Implementation of a Platform for Adaptive Online Learning Based on Fuzzy Logic
Authors: Budoor Al Abid
Abstract:
Educational systems are increasingly provided as open online services, offering guidance and support to individual learners. To adapt such learning systems, a proper evaluation must be made. This paper builds the evaluation model Fuzzy C-Means Adaptive System (FCMAS), based on data mining techniques, to assess the difficulty of questions. The following steps are implemented. First, a dataset from an international online learning system (slepemapy.cz) is used; it contains over 1,300,000 records with 9 features covering student, question, and answer information with feedback evaluation. Next, normalization is applied as a preprocessing step. Then the fuzzy c-means (FCM) clustering algorithm is used to adapt the difficulty of the questions; the result is data labeled into three clusters according to the highest membership weight (easy, intermediate, difficult), the FCM algorithm assigning a label to every question. A Random Forest (RF) classifier is then constructed on the clustered dataset, using 70% of the data for training and 30% for testing; the model achieves a 99.9% accuracy rate. This approach improves the adaptive e-learning system because it depends on student behavior and gives more accurate results in the evaluation process than an evaluation system that depends on feedback alone. Keywords: machine learning, adaptive, fuzzy logic, data mining
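The clustering step can be sketched with a minimal fuzzy c-means implementation. This is a generic sketch on hypothetical one-dimensional difficulty features, not the paper's pipeline or the slepemapy.cz data; a hard easy/intermediate/difficult label is obtained by taking the cluster of highest membership per question.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100):
    """Minimal fuzzy c-means: returns (centers, membership matrix U)."""
    # initialise centres at spread-out quantiles so they start separated
    centers = np.quantile(X, np.linspace(0.1, 0.9, c), axis=0)
    for _ in range(n_iter):
        # distance of every point to every centre: (n_points, c)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # membership update: u_ij proportional to d_ij^(-2/(m-1)), rows sum to 1
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
        # centre update: weighted mean with weights u_ij^m
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
    return centers, U
```

Hard labels follow as `U.argmax(axis=1)`, after which any supervised classifier (such as the Random Forest described above) can be trained on the labeled questions.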
Procedia PDF Downloads 196
22520 Lessons of Passive Environmental Design in the Sarabhai and Shodan Houses by Le Corbusier
Authors: Juan Sebastián Rivera Soriano, Rosa Urbano Gutiérrez
Abstract:
The Shodan House and the Sarabhai House (Ahmedabad, India, 1954 and 1955, respectively) are considered among the most important works of the last stage of Le Corbusier's career. Some academic publications study the compositional and formal aspects of their architectural design, but there is no in-depth investigation into how the climatic conditions of this region were a determining factor in the design decisions implemented in these projects. This paper argues that Le Corbusier developed a specific architectural design strategy for these buildings based on scientific research on climate in the Indian context. This new language was informed by a pioneering study and interpretation of climatic data as a design methodology that would even involve the development of new design tools. This study investigated whether his use of climatic data meets the values and levels of accuracy obtained with contemporary instruments and tools, such as EnergyPlus weather data files and Climate Consultant. It also intended to find out whether the intentions and decisions of Le Corbusier's office were indeed appropriate and efficient for those climate conditions, by assessing these projects using BIM models and energy performance simulations in DesignBuilder. Accurate models were built from original historical data gathered through archival research. The outcome is a new understanding of the environment of these houses through the combination of modern building science and architectural history. The results confirm that these houses achieved a model of low energy consumption. This paper contributes new evidence not only on exemplary modern architecture concerned with environmental performance but also on how it developed progressive thinking in this direction. Keywords: bioclimatic architecture, Le Corbusier, Shodan, Sarabhai Houses
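Working with the EnergyPlus weather files mentioned above typically starts with parsing the EPW format. The sketch below assumes the standard EPW layout (8 header lines; in each data row, field 1 is the month and field 6 the dry-bulb temperature in degrees C, 0-indexed) and is a generic illustration, not the authors' workflow.

```python
import csv
from collections import defaultdict

def monthly_mean_dry_bulb(epw_lines):
    """Mean dry-bulb temperature per month from EPW-format text lines."""
    sums, counts = defaultdict(float), defaultdict(int)
    for row in csv.reader(epw_lines[8:]):   # skip the 8 EPW header lines
        month, tdb = int(row[1]), float(row[6])
        sums[month] += tdb
        counts[month] += 1
    return {m: sums[m] / counts[m] for m in sorted(sums)}
```

Monthly means of this kind are the raw material for the bioclimatic charts that tools such as Climate Consultant draw from the same files.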
Procedia PDF Downloads 65
22519 Impact of Applying Bag House Filter Technology in Cement Industry on Ambient Air Quality - Case Study: Alexandria Cement Company
Authors: Haggag H. Mohamed, Ghatass F. Zekry, Shalaby A. Elsayed
Abstract:
Most sources of air pollution in Egypt are of anthropogenic origin. Alexandria Governorate is located in the north of Egypt. The main sectors contributing to air pollution in Alexandria are industry, transportation, and area sources due to human activities; Alexandria hosts more than 40% of the industrial activities in Egypt, and cement manufacture contributes a significant amount to the particulate pollution load. The surroundings of the Alexandria Portland Cement Company (APCC) were selected as the study area. Continuous monitoring data of Total Suspended Particulates (TSP) from the APCC main kiln stack were collected to assess the dust emission control technology. An electrostatic precipitator (ESP) had been fitted on the cement kiln since 2002. The TSP data from the first quarter of 2012 were compared with those from the first quarter of 2013, after installation of the new baghouse filter. In the present study, based on these monitoring data and meteorological data, a detailed air dispersion modeling investigation was carried out using the Industrial Source Complex Short Term model (ISC3-ST) to find out the impact of applying the new baghouse filter control technology on the neighborhood's ambient air quality. The model results show a drastic reduction of the ambient TSP hourly average concentration, from 44.94 ug/m3 to 5.78 ug/m3, which confirms the large positive impact on ambient air quality of applying baghouse filter technology to the APCC cement kiln. Keywords: air pollution modeling, ambient air quality, baghouse filter, cement industry
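ISC3-ST is built on the steady-state Gaussian plume equation, which can be sketched as below. This is the generic textbook form with ground reflection, not the full ISC3 implementation: in the real model the dispersion coefficients sigma_y and sigma_z come from stability-class curves as functions of downwind distance, and the stack height, wind speed, and emission values here are purely illustrative.

```python
import math

def plume_concentration(q_g_s, u_m_s, sigma_y, sigma_z,
                        y=0.0, z=0.0, h_stack=50.0):
    """Gaussian plume concentration (g/m^3) at crosswind offset y and
    height z, for emission rate q (g/s), wind speed u (m/s), and
    dispersion coefficients sigma_y, sigma_z (m), with ground reflection."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h_stack)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h_stack)**2 / (2 * sigma_z**2)))
    return q_g_s / (2 * math.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical
```

Because concentration is linear in the emission rate, cutting stack emissions by a given factor cuts the predicted ambient contribution by the same factor, which is the mechanism behind the modeled drop from 44.94 to 5.78 ug/m3.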
Procedia PDF Downloads 269
22518 Application of Single Subject Experimental Designs in Adapted Physical Activity Research: A Descriptive Analysis
Authors: Jiabei Zhang, Ying Qi
Abstract:
The purpose of this study was to develop a descriptive profile of adapted physical activity research using single subject experimental designs. All research articles using single subject experimental designs published in the journal Adapted Physical Activity Quarterly from 1984 to 2013 served as the data source. Each article was coded into a subcategory of seven categories: (a) the size of the sample; (b) the age of participants; (c) the type of disabilities; (d) the type of data analysis; (e) the type of designs; (f) the independent variable; and (g) the dependent variable. Frequencies, percentages, and trend inspection were used to analyze the data and develop a profile. The profile shows that a small portion of research articles used single subject designs; in those articles, most researchers used a small sample size, recruited children as subjects, emphasized learning and behavior impairments, selected visual inspection with descriptive statistics, preferred a multiple baseline design, focused on the effects of therapy, inclusion, and strategy, and measured desired behaviors more often, with a decreasing trend over the years. Keywords: adapted physical activity research, single subject experimental designs, physical education, sport science
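The frequency-and-percentage tabulation at the heart of this descriptive analysis can be sketched in a few lines. The category labels below are illustrative examples, not the study's actual codes.

```python
from collections import Counter

def coding_profile(article_codes):
    """Frequency and percentage of each subcategory within one category,
    given the list of codes assigned to the articles."""
    counts = Counter(article_codes)
    total = len(article_codes)
    return {code: (n, round(100 * n / total, 1))
            for code, n in counts.most_common()}
```

Running this once per coding category (sample size, age, disability type, and so on) yields exactly the kind of frequency/percentage profile the study reports.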
Procedia PDF Downloads 467
22517 An Alternative Credit Scoring System in China's Consumer Lending Market: A System Based on Digital Footprint Data
Authors: Minjuan Sun
Abstract:
Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, and the growth rate of non-bank lending has surpassed that of bank lending due to developments in financial technology. On the other hand, China does not have a universal credit scoring and registration system to guide lenders during credit evaluation and risk control; for example, an individual's bank credit records are not available to online lenders, and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess a borrower's creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumer loan market. Two different types of digital footprint data are matched with banks' loan default records. Each captures distinct dimensions of a person's characteristics, such as shopping patterns and aspects of personality or inferred demographics revealed by social media features like profile image and nickname. We find that both datasets can generate acceptable to excellent prediction results and that different types of data tend to complement each other for better performance.
The traditional types of data that banks normally use, such as income, occupation, and credit history, update over longer cycles and hence cannot reflect more immediate changes, such as a deterioration in financial status caused by a business crisis; digital footprints, by contrast, update daily, weekly, or monthly, and can thus provide a more comprehensive profile of the borrower's credit capabilities and risks. From this empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment because of their near-universal data coverage and because, coming in much larger volume and at higher frequency, they can by and large resolve the "thin-file" issue. Keywords: credit score, digital footprint, Fintech, machine learning
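The comparative evaluation of default-prediction models usually reduces to comparing AUC, the probability that a randomly chosen defaulter is scored riskier than a randomly chosen non-defaulter. The rank-based computation can be sketched as below; this is a generic metric implementation on synthetic scores, not the paper's loan data or its specific models.

```python
import numpy as np

def auc(y_true, scores):
    """Rank-based (Mann-Whitney) AUC: probability that a random positive
    example receives a higher score than a random negative example."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y_true == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

On synthetic data with a shared latent risk factor, averaging a "traditional" score with an independent "footprint" score raises AUC above either alone, illustrating the complementarity claim made above.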
Procedia PDF Downloads 161
22516 The Revised Completion of Student Internship Report by Goal Mapping
Authors: Faizah Herman
Abstract:
This study aims to explore the attitudes and behavior of goal mapping performed by students completing revisions of their internship reports on time. The approach is phenomenological research with qualitative methods. Data sources include observation, interviews, questionnaires, and focus group discussions. The research subjects were 5 students who had completed their internship report revisions in a timely manner. The analysis technique is the interactive model of Miles and Huberman. The results showed that the students have a goal mapping that includes the ultimate goal and formulates goals by identifying what things need to be done, what action is to be taken, and what kind of support is needed from the environment. Keywords: goal mapping, revision internship report, students, Brawijaya
Procedia PDF Downloads 395
22515 The Relationships between Market Orientation and Competitiveness of Companies in Banking Sector
Authors: Patrik Jangl, Milan Mikuláštík
Abstract:
The objective of the paper is to measure and compare the market orientation of Swiss and Czech banks, as well as to examine statistically the degree of influence it has on the competitiveness of the institutions. The analysis of market orientation is based on the collection, analysis, and correct interpretation of the data: descriptive analysis describes the current situation, and the relation between competitiveness and market orientation in the sector of big international banks is investigated with the expectation that a strong relationship exists. In part, the work served to reconfirm the suitability of classic methodologies for measuring banks' market orientation. Two types of data were gathered: first, by measuring the subjectively perceived market orientation of a company, and second, by quantifying its competitiveness. All data were collected from a sample of small, mid-sized, and large banks, with numerical secondary data drawn from Bureau van Dijk's international BANKSCOPE financial database. Statistical analysis led to the following results. Assuming classical market orientation measures to be scientifically justified, Czech banks are statistically less market-oriented than Swiss banks. Among small Swiss banks that are not broadly active internationally, only a small relationship exists between market orientation measures and market-share-based competitiveness measures; among all Swiss banks, however, a strong relationship exists between the two. These results imply a strong relation in the sector of big international banks.
A strong statistical relationship has also been proven to exist between market orientation measures and the equity/total assets ratio in Switzerland. Keywords: market orientation, competitiveness, marketing strategy, measurement of market orientation, relation between market orientation and competitiveness, banking sector
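The strength-of-relationship statements above are typically backed by a correlation coefficient. A minimal sketch, with illustrative numbers rather than the study's bank data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between a market-orientation score and a
    competitiveness measure (e.g., the equity/total-assets ratio)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])
```

Values near +1 or -1 indicate a strong linear relationship; significance would additionally require a test against the sample size.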
Procedia PDF Downloads 476
22514 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data
Authors: S. Jurado, E. Pazmino
Abstract:
Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, and oil extraction. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550^3 elements and binarized to denote solid and void regions, from which porosity is determined. Subsequently, the algorithm identifies the layer of void voxels next to the solid boundaries, and an iterative process removes, or 'burns', void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize execution time and use of computer memory, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn-layer data during the iterative process. The medial axis is determined by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and used to determine the pore-throat size distribution. A graphical user interface was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats.
Preliminary tests of the software developed during this study achieved medial axis, pore-throat size distribution, and porosity determination of 100^3, 320^3, and 550^3 voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data in the academic community. Keywords: medial axis, pore-throat distribution, porosity, porous media
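The layer-by-layer burn described above can be sketched in 2D as a grassfire peel: each pass removes the void voxels touching a non-void 4-neighbour and records the pass number, and the medial axis falls where the burn fronts collide (the local maxima of the burn number). This is a simplified 2D illustration, not the paper's 3D, subdomain-segmented implementation; the domain boundary is treated as solid.

```python
import numpy as np

def burn_numbers(void):
    """Grassfire burn on a 2D boolean void mask: returns each void voxel's
    burn layer (solid voxels stay 0). Layer k = city-block distance k to
    the nearest solid voxel or domain boundary."""
    burn = np.zeros_like(void, dtype=int)
    remaining = void.copy()
    layer = 0
    while remaining.any():
        layer += 1
        padded = np.pad(remaining, 1, constant_values=False)
        # count void 4-neighbours; a voxel burns if any neighbour is non-void
        n_void = (padded[:-2, 1:-1].astype(int) + padded[2:, 1:-1]
                  + padded[1:-1, :-2] + padded[1:-1, 2:])
        front = remaining & (n_void < 4)
        burn[front] = layer
        remaining &= ~front
    return burn
```

For a straight void channel between two solid walls, the burn numbers rise toward the centre line and peak on it, so the medial axis is recovered as the ridge of the burn field, and the burn number at a throat gives its radius.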
Procedia PDF Downloads 115
22513 Seismological Studies in Some Areas in Egypt
Authors: Gamal Seliem, Hassan Seliem
Abstract:
The Aswan area is one of the important areas in Egypt, and because it encompasses the vital engineering structure of the High Dam, it has been selected for the present study. The study of crustal deformation and gravity associated with earthquake activity in the High Dam area is of great importance for the safety of the High Dam and its economic resources. This paper uses micro-gravity, precise leveling, and GPS data for geophysical and geodetic studies. A detailed gravity survey was carried out and gravity stations were established in the area for studying the subsurface structures. To study recent vertical movements, a profile of 10 km length joining the High Dam and the Aswan old dam was established along the road connecting the two dams; this profile consists of 35 GPS/leveling stations extending along the two sides of the road and on the High Dam body. Precise leveling was carried out together with GPS and repeated micro-gravity surveys. A GPS network consisting of nine stations was established for studying recent crustal movements. Many campaigns from December 2001 to December 2014 were performed to collect the gravity, leveling, and GPS data. The main aim of this work is to study the structural features and behavior of the area, as depicted by repeated micro-gravity, precise leveling, and GPS measurements, and the present work focuses on the analysis of these data. The gravity results investigate the subsurface geologic structures and reveal minor structural features and anomalies trending W-E and N-S. The geodetic results indicate low rates of vertical and horizontal displacement and low strain values, which may be related to the stability of the area. Keywords: repeated micro-gravity changes, precise leveling, GPS data, Aswan High Dam
Procedia PDF Downloads 448
22512 HcDD: The Hybrid Combination of Disk Drives in Active Storage Systems
Authors: Shu Yin, Zhiyang Ding, Jianzhong Huang, Xiaojun Ruan, Xiaomin Zhu, Xiao Qin
Abstract:
Since large-scale, data-intensive applications have been widely deployed, there is a growing demand for high-performance storage systems to support them. Compared with traditional storage systems, next-generation systems will embrace dedicated processors to reduce the computational load of host machines and will use hybrid combinations of different storage devices. The advent of the flash-memory-based solid state disk has played a critical role in revolutionizing the storage world. However, instead of simply replacing the traditional magnetic hard disk with the solid state disk, it is believed that finding a complementary approach that incorporates both of them is more challenging and attractive. This paper explores the idea of active storage, an emerging storage configuration, in terms of architecture and design, parallel processing capability, cooperation with other machines in a cluster computing environment, and a disk configuration: the hybrid combination of different types of disk drives. Experimental results indicate that the proposed HcDD achieves better I/O performance and a longer storage system lifespan. Keywords: parallel storage system, hybrid storage system, data intensive, solid state disks, reliability
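One way the complementary SSD/HDD pairing can work is frequency-based tiering: cold blocks live on the HDD, and blocks read often enough are promoted to the limited SSD tier. The sketch below is a toy illustration of that generic policy, not the HcDD design itself; the capacity and threshold values are arbitrary.

```python
from collections import Counter

class HybridStore:
    """Toy hot/cold placement: blocks start on HDD and are promoted to a
    capacity-limited SSD tier once their read count crosses a threshold."""

    def __init__(self, ssd_capacity=2, promote_after=3):
        self.ssd, self.hdd = set(), set()
        self.hits = Counter()
        self.cap, self.threshold = ssd_capacity, promote_after

    def write(self, block):
        self.hdd.add(block)          # new data lands on the cheap tier

    def read(self, block):
        self.hits[block] += 1
        if (block in self.hdd and self.hits[block] >= self.threshold
                and len(self.ssd) < self.cap):
            self.hdd.discard(block)
            self.ssd.add(block)      # promote the hot block
        return "ssd" if block in self.ssd else "hdd"
```

Keeping only hot blocks on flash both exploits the SSD's low read latency and limits write traffic to it, which is one route to the longer system lifespan claimed above.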
Procedia PDF Downloads 448
22511 Migration, Violent Extremism and Gang Violence in Trinidad and Tobago
Authors: Raghunath Mahabir
Abstract:
This paper provides an analysis of the existing evidence on the relationships between the migration of Venezuelans into Trinidad and Tobago, violent extremism, and gang violence. Arguing that there is a dearth of reliable data on the subject, the paper fills the gap by providing relevant definitions of the terms used, discussing the sources of data, and identifying the causes of this migration and the subsequent ramifications for Trinidad and Tobago and for the migrants themselves. A simple but clear classification pointing to the nexus between migration, gang violence, and violent extremism is developed, following the logic of the migration of criminals (gang members), their need to link with local gangs, and the view that certain elements within Trinidadian society have become radicalized to the point where violent extremism is displayed in different ways. The paper highlights implications for further policy debate: the imperative of more effective communication between government officials responsible for migration and the personnel tasked with countering violent extremism and gang violence; the promotion and execution of better integration and social inclusion policies, which are necessary to minimize social exclusion and the threat of violent extremist agendas emanating from both Venezuelans and Trinidadians; and, generally, the establishment of a strong analytical framework grounded in stronger definitions, more reliable data, and other evidence that can guide further research and analysis and contribute to policy formation. Keywords: migration, violent extremism, gangs, Venezuela
Procedia PDF Downloads 54
22510 Regional Flood-Duration-Frequency Models for Norway
Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu
Abstract:
Design flood values give estimates of flood magnitude within a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required at different flood durations. A statistical approach to this problem is the development of a regression model for extremes in which some of the parameters depend on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the generalized extreme value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in developing the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway. Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV
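The single-site building block of a QDF model, estimating a design flood from a GEV fitted to annual maxima, can be sketched with SciPy. This is not the authors' Bayesian regional model, just the at-site estimate it generalizes; note that SciPy's shape parameter c carries the opposite sign to the shape parameter xi usual in hydrology.

```python
from scipy.stats import genextreme

def design_flood(annual_maxima, return_period):
    """Fit a GEV to annual maxima (MLE) and return the design flood, i.e.
    the level exceeded with probability 1/return_period in any year."""
    c, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.isf(1.0 / return_period, c, loc, scale)
```

The regional/duration-dependent extension discussed above replaces the constant `loc`, `scale`, and `c` with functions of catchment covariates and flood duration, which is precisely where the link-function difficulty with the GEV's parameter-dependent support arises.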
Procedia PDF Downloads 72
22509 Business Intelligence Dashboard Solutions for Improving Decision Making Process: A Focus on Prostate Cancer
Authors: Mona Isazad Mashinchi, Davood Roshan Sangachin, Francis J. Sullivan, Dietrich Rebholz-Schuhmann
Abstract:
Background: Decision-making processes are nowadays driven by data, data analytics, and business intelligence (BI). As a software platform, BI can provide a wide variety of capabilities, such as organizational memory, information integration, insight creation, and presentation capabilities. Visualizing data through dashboards is one BI solution (applicable to a variety of areas) that helps managers in decision-making processes by exposing the most informative information at a glance. In the healthcare domain to date, dashboards are more frequently used to track performance-related metrics and less frequently used to monitor the quality parameters that relate directly to patient outcomes, even though providing effective and timely care and improving health outcomes depend heavily on how data and information are presented and visualized. Objective: This research focuses on the presentation capabilities of BI to design a dashboard for prostate cancer (PC) data that allows better decision making for patients, the hospital, and the healthcare system. The aim is to present a retrospective PC dataset in a dashboard interface that gives a better understanding of the data in the categories that matter most to patients and other stakeholders: risk factors, treatment approaches, disease control, and side effects. By presenting the outcome in the dashboard, we address one of the major targets of a value-based healthcare (VBHC) delivery model: measuring value and presenting the outcome to the different actors in the healthcare industry, such as patients and doctors, for better decision making. Method: To visualize the stored data for users, three interactive dashboards based on the PC dataset were developed (using Tableau) to provide better views of the risk factors, treatment approaches, and side effects.
Results: Many benefits derive from the interactive graphs and tables in the dashboards, which make it easy to see the patients at risk, to understand the relationship between a patient's status after treatment and their initial status before treatment, and to choose treatments with fewer side effects given a patient's status. Conclusions: Building a well-designed and informative dashboard depends on three important factors: the users, the goals, and the data types. A dashboard's hierarchies, drill-downs, and graphical features can guide doctors in navigating the information. The features of the interactive PC dashboard not only let doctors ask specific questions and filter results on key performance indicators (KPIs) such as Gleason grade and patient age and status, but may also help patients better understand different treatment outcomes, such as side effects over time, and take an active role in their treatment decisions. We are currently extending the results to a real-time interactive dashboard in which users (both patients and doctors) can easily explore the data by choosing preferred attributes, enabling better near-real-time decisions. Keywords: business intelligence, dashboard, decision making, healthcare, prostate cancer, value-based healthcare
Procedia PDF Downloads 141