Search results for: regression models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9159

7869 A Deep Learning Approach to Real Time and Robust Vehicular Traffic Prediction

Authors: Bikis Muhammed, Sehra Sedigh Sarvestani, Ali R. Hurson, Lasanthi Gamage

Abstract:

Vehicular traffic events have overly complex spatial correlations and temporal interdependencies and are also influenced by environmental events such as weather conditions. To capture these spatial and temporal interdependencies and make more realistic vehicular traffic predictions, graph neural network (GNN) based traffic prediction models have been extensively utilized because they capture non-Euclidean spatial correlation very effectively. However, most existing GNN-based traffic prediction models are limited in learning complex and dynamic spatial and temporal patterns because of the following missing factors. First, most GNN-based traffic prediction models have used static distance, or sometimes haversine distance, between spatially separated traffic observations to estimate spatial correlation. Second, most GNN-based traffic prediction models have not incorporated environmental events, which have a major impact on normal traffic states. Finally, most GNN-based models do not use an attention mechanism to focus on only the important traffic observations. The objective of this paper is to study and make real-time vehicular traffic predictions while incorporating the effect of weather conditions. To fill the previously mentioned gaps, our prediction model uses the real-time driving distance between sensors to build a distance matrix, or spatial adjacency matrix, and capture spatial correlation. In addition, our prediction model considers the effect of six types of weather conditions and applies an attention mechanism in both spatial and temporal data aggregation. Our prediction model efficiently captures the spatial and temporal correlation between traffic events; it relies on a graph attention network (GAT) and bidirectional long short-term memory (Bi-LSTM) plus attention layers and is called GAT-BILSTMA.
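
As an illustration of the spatial-adjacency step described above, the sketch below builds a weighted adjacency matrix from pairwise driving distances between sensors using a thresholded Gaussian kernel. The distance values, kernel width, and threshold are hypothetical placeholders, since the abstract does not publish the exact construction used in GAT-BILSTMA.

```python
import numpy as np

def build_adjacency(driving_dist, sigma=None, threshold=0.1):
    """Turn a matrix of pairwise driving distances (km) between sensors
    into a weighted spatial adjacency matrix via a Gaussian kernel.

    Entries below `threshold` are zeroed so the graph stays sparse.
    """
    d = np.asarray(driving_dist, dtype=float)
    if sigma is None:
        sigma = d[d > 0].std()          # kernel width taken from the data itself (assumption)
    adj = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    adj[adj < threshold] = 0.0          # drop weak links
    np.fill_diagonal(adj, 1.0)          # self-loops, as graph attention layers usually expect
    return adj

# Hypothetical driving distances (km) between three road sensors.
dist = np.array([[0.0, 2.5, 6.0],
                 [2.5, 0.0, 3.2],
                 [6.0, 3.2, 0.0]])
print(build_adjacency(dist))
```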

Keywords: deep learning, real time prediction, GAT, Bi-LSTM, attention

Procedia PDF Downloads 67
7868 An Analysis of Packaging Materials for an Energy-Efficient Wrapping System

Authors: John Sweeney, Martin Leeming, Raj Thaker, Cristina L. Tuinea-Bobe

Abstract:

Shrink wrapping is widely used as a method for secondary packaging to assemble individual items, such as cans or other consumer products, into single packages. This method involves conveying the packages into heated tunnels and so has the disadvantages that it is energy-intensive and, in the case of aerosol products, potentially hazardous. We are developing an automated packaging system that uses stretch wrapping to address both these problems by using a mechanical rather than a thermal process. In this study, we present a comparison of shrink wrapping and stretch wrapping materials to assess the relative capability of candidate stretch wrap polymer films in terms of their mechanical response. The stretch wrap materials are of oriented polymer and therefore elastically anisotropic. We are developing material constitutive models that include both anisotropy and nonlinearity. These material models are to be incorporated into computer simulations of the automated stretch wrapping system. We present results showing the validity of these models and the feasibility of applying them in the simulations.

Keywords: constitutive model, polymer, mechanical testing, wrapping system

Procedia PDF Downloads 291
7867 Grading Histopathology Features of Graft-Versus-Host Disease in Animal Models: A Systematic Review

Authors: Hami Ashraf, Farid Kosari

Abstract:

Graft-versus-host disease (GvHD) is a common complication of allogeneic hematopoietic stem cell transplantation that can lead to significant morbidity and mortality. Histopathological examination of affected tissues is an essential tool for diagnosing and grading GvHD in animal models, which are used to study disease mechanisms and evaluate new therapies. In this systematic review, we identified and analyzed original research articles in PubMed, Scopus, Web of Science, and Google Scholar that described grading systems for GvHD in animal models based on histopathological features. We found that several grading systems have been developed, which vary in the tissues and criteria they assess, the severity scoring scales they use, and the level of detail they provide. Skin, liver, and gut are the most commonly evaluated tissues, but lung and thymus are also included in some systems. Our analysis highlights the need for standardized criteria and consistent use of grading systems to enable comparisons between studies and facilitate the translation of preclinical findings to clinical practice.

Keywords: graft-versus-host disease, GvHD, animal model, histopathology, grading system

Procedia PDF Downloads 62
7866 A Unified Model for Longshore Sediment Transport Rate Estimation

Authors: Aleksandra Dudkowska, Gabriela Gic-Grusza

Abstract:

Wind-wave-induced sediment transport is an important multidimensional and multiscale dynamic process affecting coastal seabed changes and coastline evolution. Knowledge of the sediment transport rate is important for solving many environmental and geotechnical issues. There are many types of sediment transport models, but none of them is widely accepted, because the process is not fully defined. Another problem is a lack of sufficient measurement data to verify proposed hypotheses. There are different types of models for longshore sediment transport (LST, which is discussed in this work) and cross-shore transport, which relate to different time and space scales of the processes. There are models describing bed-load transport (discussed in this work), suspended transport and total sediment transport. LST models use, among other things, information about (i) the flow velocity near the bottom, which in the case of wave-current interaction in the coastal zone is a separate problem, and (ii) the critical bed shear stress, which strongly depends on the type of sediment and becomes complicated in the case of heterogeneous sediment. Moreover, the LST rate strongly depends on the local environmental conditions. To organize existing knowledge, a series of sediment transport model intercomparisons was carried out as part of the project “Development of a predictive model of morphodynamic changes in the coastal zone”. Four classical one-grid-point models were studied and intercompared over a wide range of bottom shear stress conditions, corresponding to wind-wave conditions appropriate for the coastal zone in Polish marine areas. The set of models comprises classical theories that assume a simplified influence of turbulence on sediment transport (Du Boys, Meyer-Peter & Muller, Ribberink, Engelund & Hansen). It turned out that the estimated instantaneous longshore mass sediment transport values are in general agreement with earlier studies and measurements conducted in the area of interest. However, none of the formulas really stands out from the rest as being particularly suitable for the test location over the whole analyzed flow velocity range. Therefore, based on the models discussed, a new unified formula for longshore sediment transport rate estimation is introduced, which constitutes the main original result of this study. The sediment transport rate is calculated from the bed shear stress and the critical bed shear stress. The dependence on environmental conditions is expressed by one coefficient (in the form of a constant or a function), so the model presented can be quite easily adjusted to local conditions. The importance of each model parameter for specific velocity ranges is discussed. Moreover, it is shown that the near-bottom flow velocity is the main determinant of longshore bed load in storm conditions. Thus, the accuracy of the results depends less on the sediment transport model itself and more on the appropriate modeling of the near-bottom velocities.
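
By way of illustration, the sketch below evaluates a bed-load transport rate with the classical Meyer-Peter and Muller formula, one of the four one-grid-point models compared above. The grain size, density ratio, and shear stresses are placeholder values, not measurements from the study area.

```python
import numpy as np

def mpm_bedload(tau_b, tau_cr, d50, rho_s=2650.0, rho_w=1025.0, g=9.81):
    """Meyer-Peter & Muller bed-load rate per unit width (m^2/s).

    tau_b  : bed shear stress [Pa]
    tau_cr : critical bed shear stress [Pa]
    d50    : median grain diameter [m]
    """
    s = rho_s / rho_w
    # Shields parameters for the actual and critical shear stress
    theta = tau_b / ((rho_s - rho_w) * g * d50)
    theta_cr = tau_cr / ((rho_s - rho_w) * g * d50)
    excess = np.maximum(theta - theta_cr, 0.0)      # no transport below the threshold
    q_star = 8.0 * excess ** 1.5                    # dimensionless transport rate
    return q_star * np.sqrt((s - 1.0) * g * d50 ** 3)

# Placeholder storm-condition values: tau_b = 2.0 Pa, tau_cr = 0.25 Pa, d50 = 0.2 mm
print(mpm_bedload(tau_b=2.0, tau_cr=0.25, d50=2.0e-4))
```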

Keywords: bedload transport, longshore sediment transport, sediment transport models, coastal zone

Procedia PDF Downloads 384
7865 A New Mathematical Model of Human Olfaction

Authors: H. Namazi, H. T. N. Kuan

Abstract:

It is known that in humans, adaptation to a given odor occurs within a quite short span of time (typically one minute) after the odor is presented to the brain. Different models of human olfaction have been developed by scientists, but none of these models considers the diffusion phenomenon in olfaction. A novel microscopic model of human olfaction is presented in this paper. We develop this model by incorporating the transient diffusivity. In fact, the mathematical model is written based on diffusion of the odorant within the mucus layer. By using the model developed in this paper, it becomes possible to quantify the objective strength of an odor.

Keywords: diffusion, microscopic model, mucus layer, olfaction

Procedia PDF Downloads 499
7864 Liesegang Phenomena: Experimental and Simulation Studies

Authors: Vemula Amalakrishna, S. Pushpavanam

Abstract:

Change and motion characterize and persistently reshape the world around us, on scales from molecular to global. The subtle interplay between change (reaction) and motion (diffusion) gives rise to astonishingly intricate spatial or temporal patterns. Such pattern formation in nature has been intellectually appealing to many scientists since antiquity. Periodic precipitation patterns, also known as Liesegang patterns (LP), are one of the most stimulating examples of such self-assembling reaction-diffusion (RD) systems. LP formation has great potential in micro- and nanotechnology. So far, research on LPs has concentrated mostly on how these patterns form, retrieving information to build a universal mathematical model for them. Researchers have developed various theoretical models to comprehensively describe the geometrical diversity of LPs. To the best of our knowledge, simulation studies of LPs assume arbitrary values of the RD parameters to explain experimental observations qualitatively. In this work, existing models were studied to understand the mechanism behind this phenomenon, and the challenges pertaining to these models were identified and explained. These models are not computationally efficient due to the presence of a discontinuous precipitation rate in the RD equations. To overcome the computational challenges, smoothed Heaviside functions have been introduced, which also reduce computational time. Experiments were performed using a conventional LP system (AgNO₃-K₂Cr₂O₇) to understand the effects of different gels and temperatures on the formed LPs. The model is extended to real parameter values to compare the simulated results with experimental data for both 1-D (Cartesian test tubes) and 2-D (cylindrical and Petri dish) geometries.
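
A minimal sketch of the smoothing idea mentioned above: the discontinuous precipitation switch is replaced by a smoothed Heaviside function so the reaction-diffusion right-hand side stays differentiable and solver-friendly. The supersaturation threshold, rate constant, and smoothing width below are illustrative assumptions only.

```python
import numpy as np

def heaviside_smooth(x, eps=1e-2):
    """tanh-based smoothed Heaviside step used in place of the sharp
    precipitation switch; eps controls the transition width."""
    return 0.5 * (1.0 + np.tanh(x / eps))

def precipitation_rate(c, c_star, k=1.0, eps=1e-2):
    """Precipitation term switched on only above the supersaturation
    threshold c_star, but with a smooth transition instead of a jump."""
    return k * (c - c_star) * heaviside_smooth(c - c_star, eps)

c = np.linspace(0.0, 2.0, 5)          # illustrative concentration values
print(precipitation_rate(c, c_star=1.0))
```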

Keywords: reaction-diffusion, spatio-temporal patterns, nucleation and growth, supersaturation

Procedia PDF Downloads 149
7863 Estimation of Missing Values in Aggregate Level Spatial Data

Authors: Amitha Puranik, V. S. Binu, Seena Biju

Abstract:

Missing data is a common problem in spatial analysis, especially at the aggregate level. Missing values can occur in a covariate, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of these methods can be applied to spatial data, since the data are autocorrelated. Hence there is a need to develop a method that estimates missing values in both the response variable and covariates in spatial data by taking account of the spatial autocorrelation. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of covariates, and (c) correlation between covariates and the response variable. Estimating missing values in spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlation that exists between covariates, within covariates, and between the response variable and covariates. Precise estimation of the missing data points in spatial data will result in increased precision of the estimated effects of independent variables on the response variable in spatial regression analysis.

Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis

Procedia PDF Downloads 375
7862 Simultaneous Determination of Methotrexate and Aspirin Using Fourier Transform Convolution Emission Data under Non-Parametric Linear Regression Method

Authors: Marwa A. A. Ragab, Hadir M. Maher, Eman I. El-Kimary

Abstract:

Co-administration of methotrexate (MTX) and aspirin (ASP) can cause a pharmacokinetic interaction and a subsequent increase in blood MTX concentrations, which may increase the risk of MTX toxicity. Therefore, it is important to develop a sensitive, selective, accurate and precise method for their simultaneous determination in urine. A new hybrid chemometric method has been applied to the emission response data of the two drugs. A spectrofluorimetric method for the determination of MTX through measurement of its acid-degradation product, 4-amino-4-deoxy-10-methylpteroic acid (4-AMP), was developed. Moreover, the acid-catalyzed degradation reaction enables the spectrofluorimetric determination of ASP through the formation of its active metabolite, salicylic acid (SA). The proposed chemometric method deals with convolution of the emission data using 8-point sin xi polynomials (discrete Fourier functions) after derivative treatment of these emission data. The first and second derivative curves (D1 and D2) were obtained first, and these curves were then convolved to obtain the first and second derivative under Fourier function curves (D1/FF and D2/FF). This new application was used for the resolution of the overlapped emission bands of the degradation products of both drugs, to allow their simultaneous indirect determination in human urine. Not only was this chemometric approach applied to the emission data, but the obtained data were also subjected to non-parametric linear regression analysis (Theil’s method). The proposed method was fully validated according to the ICH guidelines, and it yielded linearity ranges of 0.05-0.75 and 0.5-2.5 µg mL⁻¹ for MTX and ASP, respectively. It was found that the non-parametric method was superior to the parametric one in the simultaneous determination of MTX and ASP after the chemometric treatment of the emission spectra of their degradation products. The work combines the advantages of derivative and convolution using discrete Fourier functions with the reliability and efficacy of non-parametric data analysis. The achieved sensitivity, along with the low values of LOD (0.01 and 0.06 µg mL⁻¹) and LOQ (0.04 and 0.2 µg mL⁻¹) for MTX and ASP, respectively, by the second derivative under Fourier functions (D2/FF), was promising and guarantees its application for monitoring the two drugs in patients’ urine samples.
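
As a small illustration of the non-parametric regression step, the sketch below fits a Theil-Sen (median-of-pairwise-slopes) line to a synthetic calibration series with SciPy. The concentrations and responses are invented placeholders and do not reproduce the paper's calibration data.

```python
import numpy as np
from scipy.stats import theilslopes

# Hypothetical calibration data: concentration (ug/mL) vs. convolved emission response
conc = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75])
resp = np.array([4.1, 12.8, 25.3, 38.9, 51.6, 64.0])

# Theil's method: the slope is the median of all pairwise slopes,
# which makes the fit robust to outlying points.
slope, intercept, lo_slope, hi_slope = theilslopes(resp, conc, 0.95)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, 95% CI=({lo_slope:.2f}, {hi_slope:.2f})")
```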

Keywords: chemometrics, emission curves, derivative, convolution, Fourier transform, human urine, non-parametric regression, Theil’s method

Procedia PDF Downloads 427
7861 Analytical and Numerical Results for Free Vibration of Laminated Composites Plates

Authors: Mohamed Amine Ben Henni, Taher Hassaine Daouadji, Boussad Abbes, Yu Ming Li, Fazilay Abbes

Abstract:

The reinforcement and repair of concrete structures by bonding composite materials have become relatively common operations. Different types of composite materials can be used: carbon fiber reinforced polymer (CFRP), glass fiber reinforced polymer (GFRP), as well as functionally graded materials (FGM). The development of analytical and numerical models describing the mechanical behavior of civil engineering structures reinforced by composite materials is necessary. These models will enable engineers to select, design, and size adequate reinforcements for the various types of damaged structures. This study focuses on the free vibration behavior of orthotropic laminated composite plates using a refined shear deformation theory. In these models, the distribution of transverse shear stresses is considered parabolic, satisfying the zero shear stress condition on the top and bottom surfaces of the plates without using shear correction factors. In this analysis, the equation of motion for simply supported thick laminated rectangular plates is obtained by using Hamilton’s principle. The accuracy of the developed model is demonstrated by comparing our results with solutions derived from other higher-order models and with data found in the literature. Besides, a finite-element analysis is used to calculate the natural frequencies of laminated composite plates and is compared with those obtained by the analytical approach.

Keywords: composite materials, laminated composite plate, finite-element analysis, free vibration

Procedia PDF Downloads 289
7860 Image Captioning with Vision-Language Models

Authors: Promise Ekpo Osaine, Daniel Melesse

Abstract:

Image captioning is an active area of research in the multi-modal artificial intelligence (AI) community, as it connects vision and language understanding, especially in settings where a model must understand the content shown in an image and generate semantically and grammatically correct descriptions. In this project, we followed a standard approach to deep learning-based image captioning, an inject-style architecture for the encoder-decoder setup, where the encoder extracts image features and the decoder generates a sequence of words that represents the image content. We investigated five image encoders: ResNet101, InceptionResNetV2, EfficientNetB7, EfficientNetV2M, and CLIP. As a caption generation structure, we explored long short-term memory (LSTM). The CLIP-LSTM model demonstrated superior performance compared to the other encoder-decoder models, achieving a BLEU-1 score of 0.904 and a BLEU-4 score of 0.640. Additionally, among the CNN-LSTM models, EfficientNetV2M-LSTM exhibited the highest performance, with a BLEU-1 score of 0.896 and a BLEU-4 score of 0.586 while using a single-layer LSTM.
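
For reference, a minimal sketch of how BLEU-1 and BLEU-4 scores like those reported above can be computed with NLTK. The reference and candidate captions are invented examples, not outputs of the models in this paper.

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["a", "dog", "runs", "across", "the", "green", "field"]]   # ground-truth caption(s)
candidate = ["a", "dog", "is", "running", "across", "the", "field"]      # model-generated caption

smooth = SmoothingFunction().method1
bleu1 = sentence_bleu(reference, candidate, weights=(1, 0, 0, 0), smoothing_function=smooth)
bleu4 = sentence_bleu(reference, candidate, weights=(0.25, 0.25, 0.25, 0.25), smoothing_function=smooth)
print(f"BLEU-1 = {bleu1:.3f}, BLEU-4 = {bleu4:.3f}")
```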

Keywords: multi-modal AI systems, image captioning, encoder, decoder, BLEU score

Procedia PDF Downloads 70
7859 The Psychological Significance of Cultural and Religious Values Among the Arab Population

Authors: Michel Mikhail

Abstract:

Introduction: Values, which are the guiding principles and beliefs of our lives, have an influence on one’s psychological health. This study aims to investigate how Schwartz’s four higher-order values (conservation, openness to change, self-transcendence, and self-enhancement) and religious values influence psychological health among the Arab population. Methods: A total of 1,023 respondents from nine Arab countries aged 18 to 71 filled out an online survey with measures of the following constructs: Schwartz’s four higher-order values (Portrait Value Questionnaire-21), religious values (Sahin’s Index of Islamic Moral Values), and general psychological health (General Health Questionnaire-28). Results: Two multiple regression models were fitted to investigate the relationships between values and psychological health. Higher conservation, self-enhancement, and religious values were significantly associated with better psychological health, with conservation losing significance after religious values were added to the model. All of Schwartz’s four values were found to have a significant relationship with religious values. More self-enhancement and conservation values were associated with higher identification with religious values, and the opposite was true for the other two values. Conclusion: The findings challenged existing assumptions that conservation values relate negatively to psychological health. This finding could be explained by the congruence between conservation values and Arab culture. The most powerful relationships were those of self-enhancement and religious values, both of which were positively associated with psychological health. As such, therapists should be careful to reconsider biases against religious or conservation values and instead pay attention to their potential positive influence on one’s psychological health.

Keywords: counseling psychology, counseling and cultural values, counseling and religious values, psychotherapy and Arab values

Procedia PDF Downloads 40
7858 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada

Authors: Bilel Chalghaf, Mathieu Varin

Abstract:

Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR) have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual tall tree species (> 17 m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
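
To make the modeling step concrete, the sketch below shows a scikit-learn pipeline of the kind used in the global approach: scale the selected spectral/LiDAR variables, fit an SVM, and report overall accuracy and Cohen's kappa on held-out crowns. The feature matrix and labels are random placeholders, not the WorldView-3/LiDAR variables of the study.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 16))          # 16 selected spectral/LiDAR variables (placeholder)
y = rng.integers(0, 11, size=400)       # 11 tree species labels (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

print("overall accuracy:", accuracy_score(y_te, pred))
print("kappa:", cohen_kappa_score(y_te, pred))
```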

Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR

Procedia PDF Downloads 128
7857 Empirical Analyses of Students’ Self-Concepts and Their Mathematics Achievements

Authors: Adetunji Abiola Olaoye

Abstract:

The study examined students’ self-concepts and mathematics achievement vis-à-vis three existing theoretical models: the humanist self-concept (M1), the contemporary self-concept (M2) and the skills development self-concept (M3). As a qualitative research study, it comprised one research question, which was transformed into a hypothesis vis-à-vis the existing theoretical models. The sample comprised twelve public secondary schools, from which twenty-five mathematics teachers, twelve counselling officers and one thousand students of Upper Basic II were selected based on intact classes, as the school administrations and system did not allow for randomization. Two instruments, namely a 10-item Achievement Test in Mathematics (r1 = 0.81) and a 10-item student self-concept questionnaire (r2 = 0.75), were adapted, validated and used for the study. Data were analysed through descriptive statistics, one-way ANOVA, t-test and correlation statistics at the 5% level of significance. Findings revealed means and standard deviations of pre-achievement test scores of (51.322, 16.10), (54.461, 17.85) and (56.451, 18.22) for the humanist self-concept, contemporary self-concept and skill development self-concept, respectively. Apart from that, the study showed that there was a significant difference in the academic performance of students across the existing models (F-cal > F-value, df = (2,997); P < 0.05). Furthermore, the study revealed students’ achievement in mathematics and self-concept questionnaire scores with means and standard deviations of (57.4, 11.35) and (81.6, 16.49), respectively. The results confirmed an affirmative relationship with the contemporary self-concept model, which holds that an individual’s subject-specific self-concept is the primary determinant of higher academic achievement in that subject, as there is a statistical correlation between students’ self-concept and mathematics achievement vis-à-vis the contemporary model (M2), with -Z_cal < -Z_val, df = 998: P < 0.05*. The implications of the study are discussed, with recommendations and suggestions for further studies proffered.

Keywords: contemporary, humanists, self-concepts, skill development

Procedia PDF Downloads 233
7856 Losing Benefits from Social Network Sites Usage: An Approach to Estimate the Relationship between Social Network Sites Usage and Social Capital

Authors: Maoxin Ye

Abstract:

This study examines the relationship between social network site (SNS) usage and social capital. Because SNS usage can expand users’ networks, and the people connected in these networks may become resources to SNS users and give them an advantage in some situations, it is important to estimate the relationship between SNS usage and ‘who’ is connected, or what resources SNS users can obtain. Additionally, ‘who’ can be divided into two aspects – people who hold high positions and people who are different – hence it is important to estimate the relationship between SNS usage and high-position people and different people. This study adopts Lin’s definition of social capital and the position generator measurement, which tells us who is connected and can be divided into the same two aspects. A national dataset from the United States (N = 2,255) collected by the Pew Research Center is utilized to carry out a general regression analysis of SNS usage and social capital. The results indicate that SNS usage is negatively associated with each factor of social capital, suggesting that, compared with non-users, although SNS users can obtain more connections, the variety and resources of these connections are fewer. For this reason, we could lose benefits through SNS usage.

Keywords: social network sites, social capital, position generator, general regression

Procedia PDF Downloads 260
7855 Optimized Text Summarization Model on Mobile Screens for Sight-Interpreters: An Empirical Study

Authors: Jianhua Wang

Abstract:

To obtain key information quickly from long texts on the small screens of mobile devices, sight-interpreters need to establish an optimized summarization model for fast information retrieval. Four summarization models based on previous studies were examined, including title+key words (TKW), title+topic sentences (TTS), key words+topic sentences (KWTS) and title+key words+topic sentences (TKWTS). Psychological experiments were conducted on the four models for three different genres of interpreting texts to establish the optimized summarization model for sight-interpreters. This empirical study shows that the optimized summarization model for sight-interpreters to quickly grasp the key information of the texts they interpret is title+key words (TKW) for cultural texts, title+key words+topic sentences (TKWTS) for economic texts and topic sentences+key words (TSKW) for political texts.

Keywords: different genres, mobile screens, optimized summarization models, sight-interpreters

Procedia PDF Downloads 308
7854 The Relation between Spiritual Intelligence and Organizational Health and Job Satisfaction among the Female Staff in Islamic Azad University of Marvdasht

Authors: Reza Zarei

Abstract:

The aim of the present study is to determine the relation between spiritual intelligence, organizational health and job satisfaction among the female staff of the Islamic Azad University of Marvdasht. The population of the study includes the female staff and faculty of the Islamic Azad University of Marvdasht. The method is correlational, and the instruments are three questionnaires, namely the spiritual intelligence questionnaire (ISIS) by Amraam and Dryer, the organizational health questionnaire by Fieldman, and a job satisfaction questionnaire. In order to test the hypotheses, we used inferential statistics, Pearson correlation coefficients and regression. The findings show that there is a significant relation between spiritual intelligence and organizational health among the female staff of this unit. In addition, organizational health has a significant relation with the elements of self-consciousness and social skills; on the other hand, job satisfaction is significantly related to the elements of self-consciousness, self-control, self-motivation, sympathy and social skills in the whole sample, regardless of the participants' gender. Finally, the results of multiple regression and variance analysis showed that the spiritual intelligence variables of the female staff could predict organizational health and job satisfaction.

Keywords: job satisfaction, spiritual intelligence, organizational health, Islamic Azad University

Procedia PDF Downloads 372
7853 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs

Authors: Junaid Mahmood Alam

Abstract:

Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means of attaining optimal outcomes in clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunological analyzers, namely Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electro-chemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating and assessing the precision-analyzed Preci-control data of the various instruments, plotted against each other with regression analysis (R²). Results showed that revalidation and optimization of the respective parameters, which were accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R² for BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3 and FT4 exhibited R² values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997 and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, thus revalidating optimized precision standardization as per the IFCC recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, which will guarantee maximum patient confidence.

Keywords: revalidation, standardized, IFCC, CAP, harmonized

Procedia PDF Downloads 266
7852 Modeling Spatio-Temporal Variation in Rainfall Using a Hierarchical Bayesian Regression Model

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Gundula Bartzke, Hans-Peter Piepho

Abstract:

Rainfall is a critical component of climate governing vegetation growth and production, forage availability and quality for herbivores. However, reliable rainfall measurements are not always available, making it necessary to predict rainfall values for particular locations through time. Predicting rainfall in space and time can be a complex and challenging task, especially where the rain gauge network is sparse and measurements are not recorded consistently for all rain gauges, leading to many missing values. Here, we develop a flexible Bayesian model for predicting rainfall in space and time and apply it to Narok County, situated in southwestern Kenya, using data collected at 23 rain gauges from 1965 to 2015. Narok County encompasses the Maasai Mara ecosystem, the northern-most section of the Mara-Serengeti ecosystem, famous for its diverse and abundant large mammal populations and spectacular migration of enormous herds of wildebeest, zebra and Thomson's gazelle. The model incorporates geographical and meteorological predictor variables, including elevation, distance to Lake Victoria and minimum temperature. We assess the efficiency of the model by comparing it empirically with the established Gaussian process, Kriging, simple linear and Bayesian linear models. We use the model to predict total monthly rainfall and its standard error for all 5 * 5 km grid cells in Narok County. Using the Monte Carlo integration method, we estimate seasonal and annual rainfall and their standard errors for 29 sub-regions in Narok. Finally, we use the predicted rainfall to predict large herbivore biomass in the Maasai Mara ecosystem on a 5 * 5 km grid for both the wet and dry seasons. We show that herbivore biomass increases with rainfall in both seasons. The model can handle data from a sparse network of observations with many missing values and performs at least as well as or better than four established and widely used models, on the Narok data set. The model produces rainfall predictions consistent with expectation and in good agreement with the blended station and satellite rainfall values. The predictions are precise enough for most practical purposes. The model is very general and applicable to other variables besides rainfall.
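
As a rough analogue of the spatial prediction task, the sketch below fits a Gaussian-process regression (one of the benchmark models mentioned above) to rainfall observations with elevation, distance to the lake, and minimum temperature as predictors. The gauge records are synthetic placeholders, and the full hierarchical Bayesian model in the paper is considerably richer.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Placeholder gauge records: [elevation (km), distance to Lake Victoria (100 km), min temp (deg C)]
X = rng.uniform([1.0, 0.5, 8.0], [3.0, 3.0, 18.0], size=(23, 3))
y = 80 + 25 * X[:, 0] - 10 * X[:, 1] + rng.normal(0, 5, size=23)   # monthly rainfall (mm), simulated

kernel = 1.0 * RBF(length_scale=[1.0, 1.0, 5.0]) + WhiteKernel(noise_level=25.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict rainfall and its standard error at an unobserved grid cell
x_new = np.array([[2.0, 1.5, 12.0]])
mean, std = gp.predict(x_new, return_std=True)
print(f"predicted rainfall: {mean[0]:.1f} mm +/- {std[0]:.1f}")
```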

Keywords: non-stationary covariance function, gaussian process, ungulate biomass, MCMC, maasai mara ecosystem

Procedia PDF Downloads 291
7851 Model Observability – A Monitoring Solution for Machine Learning Models

Authors: Amreth Chandrasehar

Abstract:

Machine learning (ML) models are developed and run in production to solve various use cases that help organizations be more efficient and drive the business. But this comes at a massive development cost and in lost business opportunities. According to a Gartner report, 85% of data science projects fail, and one of the contributing factors is not paying attention to model observability. Model observability helps developers and operators pinpoint model performance issues and data drift, and helps identify the root cause of issues. This paper focuses on providing insights into incorporating model observability in model development and operationalizing it in production.
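
One common building block of such observability tooling is a statistical drift check on incoming features. The sketch below compares a production feature sample against its training baseline with a two-sample Kolmogorov-Smirnov test; the threshold and data are illustrative assumptions, not part of any particular platform.

```python
import numpy as np
from scipy.stats import ks_2samp

def drifted(train_feature, live_feature, alpha=0.01):
    """Flag data drift when the live feature distribution differs
    significantly from the training baseline (two-sample KS test)."""
    stat, p_value = ks_2samp(train_feature, live_feature)
    return p_value < alpha, stat, p_value

rng = np.random.default_rng(7)
baseline = rng.normal(0.0, 1.0, size=5000)       # feature values at training time
production = rng.normal(0.4, 1.0, size=2000)     # same feature observed in production (shifted)

flag, stat, p = drifted(baseline, production)
print(f"drift detected: {flag} (KS statistic {stat:.3f}, p-value {p:.2e})")
```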

Keywords: model observability, monitoring, drift detection, ML observability platform

Procedia PDF Downloads 107
7850 An Application of Sinc Function to Approximate Quadrature Integrals in Generalized Linear Mixed Models

Authors: Altaf H. Khan, Frank Stenger, Mohammed A. Hussein, Reaz A. Chaudhuri, Sameera Asif

Abstract:

This paper discusses a novel approach to approximating the quadrature integrals that arise in the estimation of likelihood parameters for generalized linear mixed models (GLMM). Bayesian methodology likewise requires the computation of multidimensional integrals with respect to posterior distributions, computations which are not only tedious and cumbersome but in some situations impossible to solve because of singularities, irregular domains, etc. An attempt has been made in this work to apply Sinc-function-based quadrature rules to approximate such intractable integrals, as Sinc-based methods have several advantages: the order of convergence is exponential, they work very well in the neighborhood of singularities, they are in general quite stable, and they provide highly accurate, double-precision estimates. To our knowledge, the Sinc-function-based approach is being applied in the statistical domain for the first time, and its viability and future scope for estimating the parameters of GLMM models, as well as in some other statistical areas, are discussed.
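
A minimal numerical sketch of the Sinc idea for an integral over the real line, such as those arising in GLMM likelihood contributions: the trapezoidal rule on the Sinc grid x_k = kh converges exponentially for analytic, rapidly decaying integrands. The step size and truncation below are illustrative choices, not the settings used in the paper.

```python
import numpy as np

def sinc_quadrature(f, h=0.25, N=40):
    """Approximate the integral of f over the real line by the trapezoidal
    rule on the Sinc points x_k = k*h, k = -N..N.  For analytic integrands
    that decay quickly, the error decreases exponentially with N."""
    k = np.arange(-N, N + 1)
    return h * np.sum(f(k * h))

# Gaussian-type integrand typical of random-effect integrals:
# integral of exp(-x^2) dx over the real line equals sqrt(pi)
approx = sinc_quadrature(lambda x: np.exp(-x ** 2))
exact = np.sqrt(np.pi)
print(approx, exact, abs(approx - exact))
```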

Keywords: generalized linear mixed model, likelihood parameters, quadrature, Sinc function

Procedia PDF Downloads 391
7849 Metacognitive Processing in Early Readers: The Role of Metacognition in Monitoring Linguistic and Non-Linguistic Performance and Regulating Students' Learning

Authors: Ioanna Taouki, Marie Lallier, David Soto

Abstract:

Metacognition refers to the capacity to reflect upon our own cognitive processes. Although there is an ongoing discussion in the literature on the role of metacognition in learning and academic achievement, little is known about its neurodevelopmental trajectories in early childhood, when children begin to receive formal education in reading. Here, we evaluate the metacognitive ability, estimated under a recently developed Signal Detection Theory model, of a cohort of children aged between 6 and 7 (N=60), who performed three two-alternative-forced-choice tasks (two linguistic: lexical decision task, visual attention span task, and one non-linguistic: emotion recognition task) including trial-by-trial confidence judgements. Our study has three aims. First, we investigated how metacognitive ability (i.e., how confidence ratings track accuracy in the task) relates to performance in general standardized tasks related to students' reading and general cognitive abilities using Spearman's and Bayesian correlation analysis. Second, we assessed whether or not young children recruit common mechanisms supporting metacognition across the different task domains or whether there is evidence for domain-specific metacognition at this early stage of development. This was done by examining correlations in metacognitive measures across different task domains and evaluating cross-task covariance by applying a hierarchical Bayesian model. Third, using robust linear regression and Bayesian regression models, we assessed whether metacognitive ability in this early stage is related to the longitudinal learning of children in a linguistic and a non-linguistic task. Notably, we did not observe any association between students’ reading skills and metacognitive processing in this early stage of reading acquisition. Some evidence consistent with domain-general metacognition was found, with significant positive correlations between metacognitive efficiency between lexical and emotion recognition tasks and substantial covariance indicated by the Bayesian model. However, no reliable correlations were found between metacognitive performance in the visual attention span and the remaining tasks. Remarkably, metacognitive ability significantly predicted children's learning in linguistic and non-linguistic domains a year later. These results suggest that metacognitive skill may be dissociated to some extent from general (i.e., language and attention) abilities and further stress the importance of creating educational programs that foster students’ metacognitive ability as a tool for long term learning. More research is crucial to understand whether these programs can enhance metacognitive ability as a transferable skill across distinct domains or whether unique domains should be targeted separately.

Keywords: confidence ratings, development, metacognitive efficiency, reading acquisition

Procedia PDF Downloads 147
7848 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level

Authors: Pedro M. Abreu, Bruno R. Mendes

Abstract:

The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increasing incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, as well as the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied in healthcare over the past few years. Co-payments lead to a rationing and rationalization of users’ access to healthcare services and products and, simultaneously, to a qualification and improvement of the services and products for the end-user. This analysis, of hospital practices in particular and co-payment strategies in general, was carried out across all European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) can achieve greater compliance and effectiveness, the English models (total score of 50%) can be more accessible, and the French models (total score of 50%) can be more adequate to the socio-economic and legal framework. Other European models did not show the same quality and/or performance, so they were not taken as a standard for the future design of co-payment strategies. In this sense, co-payments can be seen as a strategy not only to moderate the consumption of healthcare products and services, but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines.

Keywords: clinical pharmacy, co-payments, healthcare, medicines

Procedia PDF Downloads 248
7847 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis

Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho

Abstract:

This paper compares fuzzy-machine learning algorithms, namely Support Vector Machine (SVM) and K-Nearest Neighbor (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data are pre-processed using an Interval Type-2 Fuzzy Logic (IT2FL) algorithm, Min-Max Normalization, and Principal Component Analysis (PCA), which are used to predict feature labels in the dataset, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used to train the aforementioned machine learning models. The K-fold (with K=10) cross-validation method is used to evaluate the performance of the models using the metrics – ROC (Receiver Operating Characteristic curve), Specificity, and Sensitivity. The models are also tested with 20% of the dataset. The validation results show that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
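
As a rough sketch of the classification stage described above (the IT2FL pre-processing is not reproduced here), the code below normalizes the three sensor features, reduces them to two principal components, and scores KNN and SVM with 10-fold cross-validated ROC AUC. The sensor readings and labels are synthetic placeholders, not the dataset used in the study.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.uniform(size=(500, 3))                 # temperature, smoke, flame (placeholder readings)
y = (X @ np.array([0.5, 0.3, 0.6]) + rng.normal(0, 0.1, 500) > 0.7).astype(int)  # fire / no fire

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", probability=True),
}
for name, clf in models.items():
    pipe = make_pipeline(MinMaxScaler(), PCA(n_components=2), clf)
    auc = cross_val_score(pipe, X, y, cv=10, scoring="roc_auc")
    print(f"{name}: mean ROC AUC = {auc.mean():.4f}")
```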

Keywords: machine learning algorithms, interval type-2 fuzzy logic, fire outbreak, support vector machine, k-nearest neighbour, principal component analysis

Procedia PDF Downloads 177
7846 Impact of Artificial Intelligence Technologies on Information-Seeking Behaviors and the Need for a New Information Seeking Model

Authors: Mohammed Nasser Al-Suqri

Abstract:

The established information-seeking models were proposed more than two decades ago, prior to the evolution of the digital information era and Artificial Intelligence (AI) technologies. The lack of current information-seeking models within Library and Information Studies has resulted in fewer advancements in teaching students about information-seeking behaviors and in the design of library tools and services. In order to better address the aforementioned concerns, this study aims to propose a state-of-the-art model focused on the information-seeking behavior of library users in the Sultanate of Oman. The study aims to develop, design and contextualize a real-time, user-centric information-seeking model capable of enhancing information needs and information usage, while incorporating critical insights for digital library practices. Another aim is to establish a far-sighted, state-of-the-art frame of reference covering Artificial Intelligence (AI) while synthesizing digital resources and information for optimizing information-seeking behavior. The proposed study is empirically designed based on a mixed-method process flow, technical surveys, in-depth interviews, focus group evaluations and stakeholder investigations. The study data pool consists of users and specialist LIS staff at 4 public libraries and 26 academic libraries in Oman. The designed research model is expected to facilitate LIS by providing multi-dimensional insights with AI integration for redefining the information-seeking process and developing a technology-rich model.

Keywords: artificial intelligence, information seeking, information behavior, information seeking models, libraries, Sultanate of Oman

Procedia PDF Downloads 112
7845 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine which then generates second-layer hidden variables. Finally, in the third layer hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing as the holdout log likelihood of the correlated topic model – which is better than Dirichlet allocation - is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables characterized by the product categories to which they are related differ strongly between these three models. To derive managerial implications we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research by appropriate extensions. To include predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
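
For readers who want to experiment, the sketch below fits a single restricted Boltzmann machine to binary basket data with scikit-learn's BernoulliRBM. The baskets are random placeholders, and scikit-learn reports a pseudo-likelihood rather than the exact holdout log likelihood used in the paper.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(42)
# Placeholder binary basket matrix: 1,000 baskets x 169 product categories
baskets = (rng.random((1000, 169)) < 0.05).astype(float)

rbm = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=30, random_state=0)
rbm.fit(baskets)

# Hidden-unit activations summarize which category bundles co-occur in a basket
hidden = rbm.transform(baskets[:5])
print(hidden.round(2))
print("pseudo-likelihood of first baskets:", rbm.score_samples(baskets[:5]).round(2))
```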

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 194
7844 Elastoplastic and Ductile Damage Model Calibration of Steels for Bolt-Sphere Joints Used in China’s Space Structure Construction

Authors: Huijuan Liu, Fukun Li, Hao Yuan

Abstract:

The bolted spherical node is a common type of joint in space steel structures. The bolt-sphere joint portion almost always controls the bearing capacity of the bolted spherical node. Investigating the bearing performance and progressive failure in service often requires high-fidelity numerical models. This paper focuses on the constitutive models of the bolt steel and sphere steel used in China’s space structure construction. The elastoplastic model is determined from a standard tensile test and a calibrated Voce saturated hardening rule. Ductile damage is found to be dominant based on fractography analysis. The Rice-Tracey ductile fracture rule is then selected, and the model parameters are calibrated based on tensile tests of notched specimens. These calibrated material models can benefit research or engineering work in similar fields.
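
To illustrate the hardening-rule calibration step, the sketch below fits the Voce saturated hardening law, sigma = sigma_y + Q * (1 - exp(-b * eps_p)), to a synthetic true-stress/plastic-strain curve with SciPy. The data points and fitted parameters are placeholders, not the measured values for the bolt or sphere steels.

```python
import numpy as np
from scipy.optimize import curve_fit

def voce(eps_p, sigma_y, Q, b):
    """Voce saturated hardening: flow stress as a function of plastic strain."""
    return sigma_y + Q * (1.0 - np.exp(-b * eps_p))

# Placeholder tensile-test data (plastic strain, true stress in MPa)
eps_p = np.array([0.00, 0.01, 0.02, 0.04, 0.06, 0.10, 0.15])
sigma = np.array([640.0, 700.0, 745.0, 800.0, 830.0, 860.0, 872.0])

params, _ = curve_fit(voce, eps_p, sigma, p0=[600.0, 250.0, 20.0])
sigma_y, Q, b = params
print(f"sigma_y = {sigma_y:.0f} MPa, Q = {Q:.0f} MPa, b = {b:.1f}")
```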

Keywords: bolt-sphere joint, steel, constitutive model, ductile damage, model calibration

Procedia PDF Downloads 134
7843 Assessing the Impact of Covid-19 Pandemic on Waste Management Workers in Ghana

Authors: Mensah-Akoto Julius, Kenichi Matsui

Abstract:

This paper examines the impact of COVID-19 on waste management workers in Ghana. A questionnaire survey was conducted among 60 waste management workers in Accra metropolis, the capital region of Ghana, to understand the impact of the COVID-19 pandemic on waste generation, workers’ safety in collecting solid waste, and service delivery. To find out correlations between the pandemic and safety of waste management workers, a regression analysis was used. Regarding waste generation, the results show the pandemic led to the highest annual per capita solid waste generation, or 3,390 tons, in 2020. Regarding the safety of workers, the regression analysis shows a significant and inverse association between COVID-19 and waste management services. This means that contaminated wastes may infect field workers with COVID-19 due to their direct exposure. A rise in new infection cases would have a negative impact on the safety and service delivery of the workers. The result also shows that an increase in economic activities negatively impacts waste management workers. The analysis, however, finds no statistical relationship between workers’ service deliveries and employees’ salaries. The study then discusses how municipal waste management authorities can ensure safe and effective waste collection during the pandemic.

Keywords: Covid-19, waste management worker, waste collection, Ghana

Procedia PDF Downloads 195
7842 Modeling Core Flooding Experiments for CO₂ Geological Storage Applications

Authors: Avinoam Rabinovich

Abstract:

CO₂ geological storage is a proven technology for reducing anthropogenic carbon emissions, which is paramount for achieving the ambitious net zero emissions goal. Core flooding experiments are an important step in any CO₂ storage project, allowing us to gain information on the flow of CO₂ and brine in the porous rock extracted from the reservoir. This information is important for understanding basic mechanisms related to CO₂ geological storage as well as for reservoir modeling, which is an integral part of a field project. In this work, a different method for constructing accurate models of CO₂-brine core flooding will be presented. Results for synthetic cases and real experiments will be shown and compared with numerical models to exhibit their predictive capabilities. Furthermore, the various mechanisms which impact the CO₂ distribution and trapping in the rock samples will be discussed, and examples from models and experiments will be provided. The new method entails solving an inverse problem to obtain a three-dimensional permeability distribution which, along with the relative permeability and capillary pressure functions, constitutes a model of the flow experiments. The model is more accurate when data from a number of experiments are combined to solve the inverse problem. This model can then be used to test various other injection flow rates and fluid fractions which have not been tested in experiments. The models can also be used to bridge the gap between small-scale capillary heterogeneity effects (sub-core and core scale) and large-scale (reservoir scale) effects, known as the upscaling problem.

Keywords: CO₂ geological storage, residual trapping, capillary heterogeneity, core flooding, CO₂-brine flow

Procedia PDF Downloads 65
7841 Quantification of Glucosinolates in Turnip Greens and Turnip Tops by Near-Infrared Spectroscopy

Authors: S. Obregon-Cano, R. Moreno-Rojas, E. Cartea-Gonzalez, A. De Haro-Bailon

Abstract:

The potential of near-infrared spectroscopy (NIRS) for screening the total glucosinolate (t-GSL) content, and also, the aliphatic glucosinolates gluconapin (GNA), progoitrin (PRO) and glucobrassicanapin (GBN) in turnip greens and turnip tops was assessed. This crop is grown for edible leaves and stems for human consumption. The reference values for glucosinolates, as they were obtained by high performance liquid chromatography on the vegetable samples, were regressed against different spectral transformations by modified partial least-squares (MPLS) regression (calibration set of samples n= 350). The resulting models were satisfactory, with calibration coefficient values from 0.72 (GBN) to 0.98 (tGSL). The predictive ability of the equations obtained was tested using a set of samples (n=70) independent of the calibration set. The determination coefficients and prediction errors (SEP) obtained in the external validation were: GNA=0.94 (SEP=3.49); PRO=0.41 (SEP=1.08); GBN=0.55 (SEP=0.60); tGSL=0.96 (SEP=3.28). These results show that the equations developed for total glucosinolates, as well as for gluconapin can be used for screening these compounds in the leaves and stems of this species. In addition, the progoitrin and glucobrassicanapin equations obtained can be used to identify those samples with high, medium and low contents. The calibration equations obtained were accurate enough for a fast, non-destructive and reliable analysis of the content in GNA and tGSL directly from NIR spectra. The equations for PRO and GBN can be employed to identify samples with high, medium and low contents.
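
As a schematic of the calibration workflow, the sketch below fits a partial least-squares regression to NIR spectra and reports R² and the standard error of prediction on an independent validation set. The spectra and glucosinolate values are synthetic placeholders, and the paper itself used modified PLS on real spectra.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(5)
spectra = rng.normal(size=(350, 700))                     # placeholder NIR absorbance spectra
tgsl = 20 + spectra[:, 100] * 5 + rng.normal(0, 1, 350)   # placeholder total glucosinolate content

X_cal, X_val, y_cal, y_val = train_test_split(spectra, tgsl, test_size=70, random_state=0)

pls = PLSRegression(n_components=10).fit(X_cal, y_cal)
pred = pls.predict(X_val).ravel()

sep = np.sqrt(np.mean((y_val - pred) ** 2))               # standard error of prediction
print(f"validation R2 = {r2_score(y_val, pred):.3f}, SEP = {sep:.2f}")
```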

Keywords: brassica rapa, glucosinolates, gluconapin, NIRS, turnip greens

Procedia PDF Downloads 140
7840 An Investigation of the Relevant Factors of Unplanned Readmission within 14 Days of Discharge in a Regional Teaching Hospital in South Taiwan

Authors: Xuan Hua Huang, Shu Fen Wu, Yi Ting Huang, Pi Yueh Lee

Abstract:

Background: In Taiwan, the Taiwan Healthcare Indicator Series regards the rate of hospital readmission as an important indicator of healthcare quality. Unplanned readmission not only affects a patient's condition but also increases healthcare utilization and healthcare costs. Purpose: The purpose of this study was to explore the factors related to adult unplanned readmission within 14 days of discharge at a regional teaching hospital in southern Taiwan. Methods: A retrospective review design was used. A total of 495 participants with unplanned readmissions and 878 without readmission within 14 days were recruited from a regional teaching hospital in southern Taiwan. The instruments used included the Charlson Comorbidity Index, demographic characteristics, and disease-related variables. Statistical analyses were performed with SPSS version 22.0. Descriptive statistics (means, standard deviations, and percentages) and inferential statistics (t-test, chi-square test and logistic regression) were used. Results: The rate of unplanned readmission within 14 days was 36%. The majority were male (268, 54.1%) and aged >65 years (318, 64.2%); the mean age was 68.8 ± 14.65 years (range 23-98 years). The mean comorbidity score was 3.77 ± 2.73. The top three diagnoses at readmission were digestive diseases (32.7%), respiratory diseases (15.2%), and genitourinary diseases (10.5%). There were significant relationships with gender, age, marriage, comorbidity status, and discharge planning services (χ²: 3.816-16.474, p: 0.051-0.000). Logistic regression analysis showed that older age (OR = 1.012, 95% CI: 1.003, 1.021), multi-morbidity (OR = 0.712-4.040, 95% CI: 0.559-8.522), and consultation with discharge planning services (OR = 1.696, 95% CI: 1.105, 2.061) were associated with a higher risk of readmission. Conclusions: This study finds that multi-morbidity was an independent risk factor for unplanned readmission within 14 days. It is recommended that the medical team provide integrated care for multi-morbidity to improve patients' self-care ability and reduce the 14-day unplanned readmission rate.
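
To make the risk-factor analysis concrete, the sketch below fits a logistic regression with statsmodels and converts coefficients to odds ratios with 95% confidence intervals, mirroring how the OR values above are reported. The patient records are simulated placeholders, not the hospital data used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 1373
df = pd.DataFrame({
    "age": rng.normal(68, 15, n).clip(23, 98),
    "comorbidity": rng.poisson(3.8, n),
    "discharge_consult": rng.integers(0, 2, n),
})
# Simulated outcome: older, sicker patients are readmitted more often
logit = -4 + 0.012 * df["age"] + 0.3 * df["comorbidity"] + 0.5 * df["discharge_consult"]
df["readmitted"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age", "comorbidity", "discharge_consult"]])
model = sm.Logit(df["readmitted"], X).fit(disp=0)

odds_ratios = np.exp(model.params)
conf = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```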

Keywords: unplanned readmission, comorbidities, Charlson comorbidity index, logistic regression

Procedia PDF Downloads 146