Search results for: statistical features
6373 The Influence of Size on Fused Silica Strength: A Multi-Method Study
Authors: Şeyma Saliha Fidan, Rahmi Ünal
Abstract:
Ceramic materials exhibit inherently brittle behavior, primarily attributed to the presence of flaws that severely restrict their applicability as structural elements under tensile loading. This brittleness necessitates special attention in the design of ceramic components, with a particular focus on appropriately addressing stress distribution. Among the most commonly used uniaxial testing methods to evaluate the mechanical behavior of ceramics are three-point bending and four-point bending tests. Each of these methods induces a unique stress distribution within the specimen. Using Weibull theory and its fundamental assumptions, it is possible to account for the different stress fields produced by each testing method and compare the resulting strength data. This comparison is based on the concept of effective volume or area. In this study, slip-cast fused silica ceramics were selected as the material of interest. The study aims to apply Weibull statistical theory to various testing methods, integrating statistical tools and finite element method (FEM) simulations. A validated FEM-based approach was developed to determine the effective volumes of the specimens. The effective volume values obtained through analytical and numerical methods were compared, and the stress fields generated by different testing methods were evaluated based on Weibull theory. Moreover, the effective volume calculation procedure derived from numerical analysis methods has been adapted for use in complex test geometries and various loading conditions.
Keywords: ceramic, fused silica, effective volume, Weibull analysis, finite element method
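The effective-volume comparison described in this abstract can be sketched numerically. Under the Weibull weakest-link assumption, two loading configurations with effective volumes V1 and V2 have characteristic strengths related by sigma2 = sigma1 * (V1/V2)^(1/m), where m is the Weibull modulus. A minimal sketch follows; the closed-form effective-volume factors are the commonly cited analytical results for three-point and quarter-point four-point bending, not values taken from this paper, so treat them as assumptions:

```python
def effective_volume_3pt(volume, m):
    # Commonly cited analytical result for three-point bending:
    # V_eff = V / (2 * (m + 1)^2), with m the Weibull modulus.
    return volume / (2.0 * (m + 1) ** 2)

def effective_volume_4pt_quarter(volume, m):
    # Commonly cited result for four-point bending with quarter-point
    # loading: V_eff = V * (m + 2) / (4 * (m + 1)^2).
    return volume * (m + 2) / (4.0 * (m + 1) ** 2)

def scaled_strength(sigma_ref, v_eff_ref, v_eff_new, m):
    # Weibull size effect: strength falls as effective volume grows.
    return sigma_ref * (v_eff_ref / v_eff_new) ** (1.0 / m)
```

For example, an eightfold increase in effective volume at m = 10 reduces the predicted strength by roughly 19%, which is the kind of method-to-method correction the abstract's FEM-based effective volumes feed into.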
Procedia PDF Downloads 7
6372 Evaluating the Factors Controlling the Hydrochemistry of Gaza Coastal Aquifer Using Hydrochemical and Multivariate Statistical Analysis
Authors: Madhat Abu Al-Naeem, Ismail Yusoff, Ng Tham Fatt, Yatimah Alias
Abstract:
Groundwater in the Gaza Strip is increasingly exposed to anthropogenic and natural factors that have seriously impacted groundwater quality. Physiochemical data on groundwater can offer important information on changes in quality that is useful in improving water management tactics. Integrated hydrochemical and statistical techniques (hierarchical cluster analysis (HCA) and factor analysis (FA)) were applied to ten physiochemical parameters of 84 samples collected in 2000/2001, using STATA, AquaChem, and Surfer software, to: 1) provide insight into the salinization sources and the hydrochemical processes controlling the chemistry of the groundwater, and 2) differentiate the influence of natural processes from that of man-made activities. The data record a large diversity of water facies, dominated by the Na-Cl type, revealing a highly saline aquifer impacted by multiple complex hydrochemical processes. Based on WHO standards, only 15.5% of the wells were suitable for drinking. HCA yielded three clusters. Cluster 1 is the highest in salinity, mainly due to the impact of Eocene saline water invasion mixed with human inputs. Cluster 2 is the lowest in salinity, also due to Eocene saline water invasion but mixed with recent rainfall recharge, limited carbonate dissolution, and nitrate pollution. Cluster 3 is similar in salinity to Cluster 2, but with a high diversity of facies due to the impact of many sources of salinity, such as seawater invasion, carbonate dissolution, and human inputs. Factor analysis yielded two factors accounting for 88% of the total variance. Factor 1 (59%) is a salinization factor demonstrating the mixing contribution of natural saline water with human inputs. Factor 2 measures hardness and pollution and explains 29% of the total variance. The negative relationship between NO3- and pH may reveal a denitrification process in a heavily polluted aquifer recharged by limited oxygenated rainfall.
Multivariate statistical analysis combined with hydrochemical analysis indicates that the main factors controlling groundwater chemistry were Eocene saline invasion, seawater invasion, sewage invasion, and rainfall recharge, and that the main hydrochemical processes were base ion and reverse ion exchange with clay minerals (water-rock interactions), nitrification, carbonate dissolution, and a limited denitrification process.
Keywords: dendrogram and cluster analysis, water facies, Eocene saline invasion and sea water invasion, nitrification and denitrification
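The hierarchical clustering step this abstract relies on can be illustrated with a minimal single-linkage agglomerative routine. This is a toy stand-in for the HCA the authors ran in STATA, and the (Na, Cl)-style sample values below are invented for illustration:

```python
import math

def single_linkage_clusters(points, k):
    """Agglomerative clustering: start from singleton clusters, repeatedly
    merge the two clusters whose closest members are nearest (single
    linkage, Euclidean distance), and stop when k clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(math.dist(points[i], points[j])
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)  # merge b into a
    return [sorted(c) for c in clusters]
```

On made-up low- and high-salinity samples, the routine separates the fresh group from the saline group, which is the qualitative behavior behind the three clusters reported above.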
Procedia PDF Downloads 365
6371 Specific Language Impairment in Kannada: Evidence from a Morphologically Complex Language
Authors: Shivani Tiwari, Prathibha Karanth, B. Rajashekhar
Abstract:
Impairments of syntactic morphology are often considered central in children with Specific Language Impairment (SLI). In English and related languages, deficits in tense-related grammatical morphology can serve as a clinical marker of SLI. Yet cross-linguistic studies of SLI in the recent past suggest that the nature and severity of morphosyntactic deficits in children with SLI vary with the language being investigated. Therefore, in the present study, we investigated the morphosyntactic deficits of a group of children with SLI who speak Kannada, a morphologically complex Dravidian language spoken in the Indian subcontinent. A group of 15 children with SLI participated in this study. Two further groups of typically developing children (15 each), matched for language and age to the children with SLI, were included as controls. All participants were assessed for morphosyntactic comprehension and expression using a standardized language test and a spontaneous speech task. Results showed that children with SLI differed significantly from the age-matched but not the language-matched control group on tasks of both comprehension and expression of morphosyntax. This finding contrasts with reports on English-speaking children with SLI, who perform more poorly than younger MLU-matched children on tasks of morphosyntax. The observed difference between Kannada-speaking and English-speaking children with SLI is explained by the morphological richness theory, which predicts that children with SLI perform relatively better in morphologically rich languages owing to the frequent and consistent occurrence of the features that the morphological markers encode. The authors therefore conclude that language-specific features do influence the manifestation of the disorder in children with SLI.
Keywords: specific language impairment, morphosyntax, Kannada, manifestation
Procedia PDF Downloads 245
6370 Effect of 3-Dimensional Knitted Spacer Fabrics Characteristics on Its Thermal and Compression Properties
Authors: Veerakumar Arumugam, Rajesh Mishra, Jiri Militky, Jana Salacova
Abstract:
The thermo-physiological comfort and compression properties of knitted spacer fabrics have been evaluated by varying the different spacer fabric parameters. Air permeability and water vapor transmission of the fabrics were measured using the Textest FX-3300 air permeability tester and PERMETEST. The thermal behavior of the fabrics was then obtained with a thermal conductivity analyzer, and overall moisture management capacity was evaluated with a moisture management tester. The compression properties of the spacer fabrics were tested using the Kawabata Evaluation System (KES-FB3). In the KES testing, compression resilience, work of compression, linearity of compression, and other parameters were calculated from the pressure-thickness curves. Analysis of variance (ANOVA) was performed using the statistical software QC Expert Trilobite and Darwin in order to compare the influence of different fabric parameters on the thermo-physiological and compression behavior of the samples. This study established that the raw material, type of spacer yarn, density, thickness, and tightness of the surface layer have a significant influence on both thermal conductivity and work of compression in spacer fabrics. The parameter that mainly influences the water vapor permeability of these fabrics is the raw material, i.e., the wetting and wicking properties of the fibers. The Pearson correlation between the moisture capacity of the fabrics and water vapor permeability was found using the same software. These findings are important requirements for the further design of clothing for extreme environmental conditions.
Keywords: 3D spacer fabrics, thermal conductivity, moisture management, work of compression (WC), resilience of compression (RC)
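The Pearson correlation used in this abstract is straightforward to compute from paired measurements; a small self-contained sketch (the paired values in the test are invented, not the fabric data):

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation of two equal-length sequences:
    # covariance divided by the product of the standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 would indicate that fabrics with higher moisture capacity also transmit water vapor better, which is the relationship the abstract examines.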
Procedia PDF Downloads 546
6369 Integrated Geophysical Surveys for Sinkhole and Subsidence Vulnerability Assessment, in the West Rand Area of Johannesburg
Authors: Ramoshweu Melvin Sethobya, Emmanuel Chirenje, Mihlali Hobo, Simon Sebothoma
Abstract:
The recent surge in residential infrastructure development around the metropolitan areas of South Africa has made thorough geotechnical assessments prior to site development necessary to ensure human and infrastructure safety. This paper appraises the successful application of multi-method geophysical techniques for the delineation of sinkhole vulnerability in a residential landscape. ERT, MASW, VES, magnetic, and gravity surveys were conducted to assist in mapping sinkhole vulnerability, using an existing sinkhole as a constraint at Venterspost town, west of Johannesburg. Combining different geophysical techniques and integrating their results proved useful in delineating the lithologic succession around the sinkhole locality and in determining the geotechnical characteristics of each layer and its contribution to the development of sinkholes, subsidence, and cavities in the vicinity of the site. The study results also assisted in determining the possible depth extension of the existing sinkhole and the locations where other similar karstic features and sinkholes could form. Results of the ERT, VES, and MASW surveys uncovered dolomitic bedrock at varying depths around the sites, exhibiting high resistivity values in the range of 2500-8000 ohm.m and correspondingly high velocities in the range of 1000-2400 m/s. The dolomite layer was found to be overlain by a weathered chert-poor dolomite layer, with resistivities in the range of 250-2400 ohm.m and velocities of 500-600 m/s, within which the large sinkhole collapsed. A compiled 2.5D high-resolution shear wave velocity (Vs) map of the study area was created from 2D profiles of MASW data, offering insights into the prevailing lithological setup conducive to the formation of various types of karstic features around the site.
3D magnetic models of the site highlighted regions of possible subsurface interconnection between the existing large sinkhole and the other subsidence feature at the site. A number of depth slices were used to detail the conditions near the sinkhole as depth increases. Gravity survey results mapped the possible formational pathways for the development of new karstic features around the site. The combination and correlation of different geophysical techniques proved useful in delineating the site's geotechnical characteristics and mapping the possible depth extent of the existing sinkhole.
Keywords: resistivity, magnetics, sinkhole, gravity, karst, delineation, VES
Procedia PDF Downloads 81
6368 Prevalence of Human Papillomavirus in Squamous Intraepithelial Lesions and Cervical Cancer in Women of the North of Chihuahua, Mexico
Authors: Estefania Ponce-Amaya, Ana Lidia Arellano-Ortiz, Cecilia Diaz-Hernandez, Jose Alberto Lopez-Diaz, Antonio De La Mora-Covarrubias, Claudia Lucia Vargas-Requena, Mauricio Salcedo-Vargas, Florinda Jimenez-Vega
Abstract:
Cervical cancer (CC) is the second leading cause of death among women worldwide and has been associated with persistent infection by human papillomavirus (HPV). The goal of the current study was to identify the prevalence of HPV infection in women with abnormal Pap smears attended at the Dysplasia Clinic of Ciudad Juarez, Mexico. Methods: Cervical samples from 146 patients who attended the Colposcopy Clinic at Sanitary Jurisdiction II of Cd Juarez were collected for histopathological and molecular study. DNA was isolated for HPV detection by polymerase chain reaction (PCR) using MY09/11 and GP5/6 primers. Associated risk factors were assessed by questionnaire. Statistical analysis was performed by ANOVA using EpiInfo V7 software. Results: HPV infection was present in 142 patients (97.3%). The prevalence of HPV infection was around 96% across all evaluated groups: low-grade squamous intraepithelial lesion (LSIL), high-grade squamous intraepithelial lesion (HSIL), and CC. We found statistical significance (α = 0.05) for gestation and number of births as risk factors. The median values showed an ascending trend with lesion progression; however, CC showed a statistically significant difference with respect to the pre-carcinogenic stages. Conclusions: There is a high prevalence of HPV infection in these Mexican patients, and for that reason, we are studying the most prevalent HPV genotypes in this population.
Keywords: cervical cancer, HPV, prevalence HPV, squamous intraepithelial lesion
Procedia PDF Downloads 320
6367 Theoretical Investigations and Simulation of Electromagnetic Ion Cyclotron Waves in the Earth’s Magnetosphere Through Magnetospheric Multiscale Mission
Authors: A. A. Abid
Abstract:
Wave-particle interactions are considered paramount in the transmission of energy in collisionless space plasmas, where electromagnetic fields confine the movement of charged particles. Wave-particle interaction, one of the distinctive features of energy transfer in collisionless plasma, is ubiquitous in space plasmas. The three essential populations of the inner magnetosphere are cold plasmaspheric plasma, the ring current, and high-energy radiation belt particles. The transition region amid these populations initiates wave-particle interactions among distinct plasmas, and the wave mode perceived in the magnetosphere is the electromagnetic ion cyclotron (EMIC) wave. These waves can interact resonantly with numerous particle species; the accompanying heating of plasma particles is still under debate. In this work, we pay particular attention to how EMIC waves impact plasma species, specifically how they affect the heating of electrons and ions during storms and substorms in the magnetosphere. Using the Magnetospheric Multiscale (MMS) mission and electromagnetic hybrid simulation, this project will investigate the energy transfer mechanisms (e.g., Landau interactions, bounce resonance interactions, cyclotron resonance interactions) between EMIC waves and cold-warm plasma populations. Other features, such as the production of EMIC waves and the importance of cold plasma particles in EMIC wave-particle interactions, will also be explored.
Keywords: MMS, magnetosphere, wave particle interaction, non-Maxwellian distribution
Procedia PDF Downloads 63
6366 The Grammar of the Content Plane as a Style Marker in Forensic Authorship Attribution
Authors: Dayane de Almeida
Abstract:
This work presents a study demonstrating the usability of categories of analysis from Discourse Semiotics, also known as Greimassian Semiotics, in authorship cases in forensic contexts. It is necessary to know whether the categories examined in semiotic analysis (the ‘grammar’ of the content plane) can distinguish authors. Thus, a study with 4 sets of texts from a corpus of ‘not on demand’ written samples (texts that differ in degree of formality, purpose, addressees, themes, etc.) was performed. Each author contributed 20 texts, separated into 2 groups of 10 (Author1A, Author1B, and so on). The hypothesis was that texts from a single author are semiotically more similar to each other than texts from different authors. The assumptions and issues that led to this idea are as follows. The features analyzed in authorship studies mostly relate to the expression plane: they are manifested on the ‘surface’ of texts. If language is both expression and content, content must also be considered for more accurate results; style is present in both planes. Semiotics postulates that the content plane is structured in a ‘grammar’ that underlies expression and presents different levels of abstraction; this ‘grammar’ would be a style marker. Sociolinguistics demonstrates intra-speaker variation: an individual employs different linguistic uses in different situations. How, then, can one determine whether someone is the author of several texts, distinct in nature (as is the case in most forensic sets), when intra-speaker variation is known to depend on so many factors? The idea is that the more abstract the level in the content plane, the lower the intra-speaker variation, because there will be a greater chance of the author choosing the same thing. If two authors recurrently choose the same options, differently from one another, each one’s choices have discriminatory power. Size is another issue for various attribution methods.
Since most texts in real forensic settings are short, methods relying only on the expression plane tend to fail. The analysis of the content plane as proposed by Greimassian semiotics would be less size-dependent. The semiotic analysis was performed using the software Corpus Tool, generating tags to allow the counting of data. Similarities and differences were then measured quantitatively through the application of the Jaccard coefficient (a statistical measure that compares the similarities and differences between samples). The results confirmed the hypothesis; hence, the grammatical categories of the content plane may successfully be used in questioned authorship scenarios.
Keywords: authorship attribution, content plane, forensic linguistics, greimassian semiotics, intraspeaker variation, style
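The Jaccard coefficient applied above compares two collections of tags as the ratio of shared items to total distinct items, |A ∩ B| / |A ∪ B|. A minimal sketch; the semiotic tag names in the test are invented illustrations, not the study's tag set:

```python
def jaccard(a, b):
    # Jaccard similarity of two sets: |A ∩ B| / |A ∪ B|.
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # convention: two empty sets are identical
    return len(a & b) / len(a | b)
```

Applied per author pair over the tag sets Corpus Tool produces, higher within-author than between-author values is exactly what the study's hypothesis predicts.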
Procedia PDF Downloads 243
6365 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning
Authors: Joseph George, Anne Kotteswara Roa
Abstract:
Skin disease is one of the most common health issues faced by people today. Skin cancer (SC) is one of them, and its detection relies on skin biopsy results and the expertise of doctors, which is time-consuming and can yield inaccurate results. Detecting skin cancer at an early stage is a challenging task, yet the disease spreads easily through the body and increases the mortality rate; skin cancer is curable when detected early. To classify skin cancer correctly and accurately, the critical task is identification and classification based on disease features such as shape, size, color, and symmetry. Many skin diseases share similar characteristics, which makes selecting important features from skin cancer image datasets a challenging issue. Diagnostic accuracy can therefore be improved by an automated skin cancer detection and classification framework, which also mitigates the scarcity of human experts. Recently, deep learning techniques such as the convolutional neural network (CNN), deep belief network (DBN), artificial neural network (ANN), recurrent neural network (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. By using these DL techniques, classification accuracy increases while computational complexity and time consumption are mitigated.
Keywords: skin cancer, deep learning, performance measures, accuracy, datasets
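The evaluation metrics the survey relies on all reduce to counts of true/false positives and negatives. A small sketch computing precision, recall, and F-measure from binary labels (the label vectors in the test are invented, with 1 standing for malignant and 0 for benign):

```python
def precision_recall_f1(y_true, y_pred):
    # Tally counts over paired binary labels (1 = malignant, 0 = benign).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0   # a.k.a. sensitivity
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Sensitivity is the same quantity as recall, and specificity is the analogous ratio computed over the negative class, so the whole metric table in such surveys follows from these four counts.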
Procedia PDF Downloads 132
6364 The Effect of Corporate Governance on Financial Stability and Solvency Margin for Insurance Companies in Jordan
Authors: Ghadeer A.Al-Jabaree, Husam Aldeen Al-Khadash, M. Nassar
Abstract:
This study aimed at investigating the effect of a well-designed corporate governance system on the financial stability of insurance companies listed on the ASE. Further, this study provides a comprehensive model for evaluating and analyzing insurance companies' financial position and prospects, and for comparing the degree of application of corporate governance provisions among Jordanian insurance companies. In order to achieve the goals of the study, the whole population of 27 listed insurance companies was examined through the variables of board of directors, audit committee, internal and external auditors, board and management ownership, and blockholder identities. Statistical analyses were performed in SPSS: descriptive techniques such as means and standard deviations were used to describe the variables, while the F-test and analysis of variance (ANOVA) were used to test the hypotheses of the study. The study revealed a significant effect of the corporate governance variables (except for local companies not listed on the ASE) on financial stability, within the control variables, especially the debt ratio (leverage). It also showed that concentration in motor third-party insurance does not have a significant effect on insurance companies' financial stability during the study period. Moreover, the study concludes that the global financial crisis affected the investment side of insurance companies, with an insignificant effect on the technical side. Finally, some recommendations were presented, such as enhancing the laws and regulations that support the appropriate application of corporate governance, working to activate transparency in financial statement disclosures, and supporting the companies' technical provisions rather than focusing only on profit.
Keywords: corporate governance, financial stability and solvency margin, insurance companies, Jordan
Procedia PDF Downloads 490
6363 Regional Hydrological Extremes Frequency Analysis Based on Statistical and Hydrological Models
Authors: Hadush Kidane Meresa
Abstract:
Hydrological extremes frequency analysis is the foundation for hydraulic engineering design, flood protection, drought management, and water resources management and planning, enabling the available water resources to meet the objectives of different organizations and sectors in a country. The spatial variation of the statistical characteristics of extreme flood and drought events is key for regional flood and drought analysis and mitigation management. For regions with differing hydro-climate, where the data sets are short, scarce, of poor quality, or insufficient, regionalization methods are applied to transfer at-site data to a region. This study performs regional high- and low-flow frequency analysis for river basins in Poland. The frequent occurrence of hydrological extremes in the region and rapid water resources development have caused serious concern over the magnitudes and frequencies of floods and droughts of the rivers in Poland. The magnitude and frequency results for high and low flows in the basin are needed for flood and drought planning, management, and protection, at present and in the future. Hydrologically homogeneous high- and low-flow regions were formed by cluster analysis of site characteristics, using hierarchical and C-means clustering and PCA. Statistical tests for regional homogeneity were applied via discordancy and heterogeneity measures. In compliance with the test results, the river basins of the region were divided into ten homogeneous regions. Frequency analysis of high and low flows was then conducted using six statistical distributions, with annual maximum (AM) series for high flows and 7-day minimum series for low flows.
The use of the L-moment and LL-moment methods showed a homogeneous region over the entire province, with the generalized logistic (GLOG), generalized extreme value (GEV), Pearson type III (P-III), generalized Pareto (GPAR), Weibull (WEI), and power (PR) distributions as the regional drought and flood frequency distributions. The 95th percentile and flow duration curves of 1, 7, 10, and 30 days were plotted for 10 stations. However, the cluster analysis identified two regions in the west and east of the province, where the L-moment and LL-moment methods demonstrated the homogeneity of the regions, with the GLOG and Pearson type III (P-III) distributions as the regional frequency distributions for each region, respectively. The spatial variation and regional frequency distribution of flood and drought characteristics for the 10 best catchments from the whole region were examined; besides the main variables (high and low streamflow), variables more closely related to physiographic and drainage characteristics were used to identify and delineate homogeneous pools and to derive the best regression models for ungauged sites. These are mean annual rainfall, seasonal flow, average slope, NDVI, aspect, flow length, flow direction, maximum soil moisture, elevation, and drainage order. The regional high-flow or low-flow relationship between one streamflow characteristic (AM or 7-day mean annual low flows) and basin characteristics was developed using generalized linear mixed models (GLMM) and generalized least squares (GLS) regression, providing a simple and effective method for estimating floods and droughts of desired return periods for ungauged catchments.
Keywords: flood, drought, frequency, magnitude, regionalization, stochastic, ungauged, Poland
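The L-moment method referenced above fits distributions via probability-weighted moments of the ordered sample rather than conventional moments, which makes it robust for the short records typical of regionalization. A minimal sketch of the first two sample L-moments (L-location and L-scale); the flow values in the test are invented:

```python
def sample_l_moments(data):
    """First two sample L-moments (l1 = L-location, l2 = L-scale),
    computed from the unbiased probability-weighted moments b0 and b1
    of the ascending-ordered sample."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    # b1 weights each order statistic by its 0-based rank fraction.
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n
    l1 = b0
    l2 = 2 * b1 - b0
    return l1, l2
```

The ratio t = l2 / l1 (the L-CV) is one of the quantities the discordancy and heterogeneity measures mentioned in the abstract are built on.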
Procedia PDF Downloads 603
6362 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
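As a toy illustration of the KNN modeling step, a k-nearest-neighbors classifier over two hypothetical admission features might look as follows. The feature names (screening score, O-level aggregate), values, labels, and k are invented for illustration and are not taken from the paper's actual model:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs. Returns the majority
    label among the k nearest neighbors of query (Euclidean distance)."""
    neighbors = sorted(train, key=lambda fl: math.dist(fl[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

In the ontology-driven pipeline above, only the features the SCM validation retains would be fed into such a classifier, which is what distinguishes the approach from plain KDD-then-ML.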
Procedia PDF Downloads 67
6361 Design an Assessment Model of Research and Development Capabilities with the New Product Development Approach: A Case Study of Iran Khodro Company
Authors: Hamid Hanifi, Adel Azar, Alireza Booshehri
Abstract:
To know the capability level of R&D units in the automotive industry, it is essential that organizations continually compare themselves with a standard level, and with those above themselves, so as to improve continuously. In this research, given the importance of this issue, we present an assessment model for R&D capabilities based on a review of new product development in the automotive industry of Iran. Iran Khodro Company was selected as the case study. To this end, first, through a literature review, about 200 indicators affecting R&D capabilities and new product development were extracted. Of these, the 29 most important indicators were then selected by industry and academic experts, and a questionnaire was distributed among the statistical population, which consisted of 410 individuals at Iran Khodro Company. We used the 410 questionnaires for exploratory factor analysis and then randomly used the data of 308 questionnaires from the same population for confirmatory factor analysis. The results of the exploratory factor analysis led to the categorization of the indicators into 9 secondary dimensions. The dimensions were named according to the literature review and the professors' opinions. Using structural equation modeling and AMOS software, confirmatory factor analysis was conducted, and the final model with 9 secondary dimensions was confirmed. The 9 secondary dimensions of this research are: 1) research and design capability, 2) customer and market capability, 3) technology capability, 4) financial resources capability, 5) organizational chart, 6) intellectual capital capability, 7) NPD process capability, 8) managerial capability, and 9) strategy capability.
Keywords: research and development, new products development, structural equations, exploratory factor analysis, confirmatory factor analysis
Procedia PDF Downloads 342
6360 Micro-Meso 3D FE Damage Modelling of Woven Carbon Fibre Reinforced Plastic Composite under Quasi-Static Bending
Authors: Aamir Mubashar, Ibrahim Fiaz
Abstract:
This research presents a three-dimensional finite element modelling strategy to simulate damage in a quasi-static three-point bending analysis of a woven twill 2/2 carbon fibre reinforced plastic (CFRP) composite at the micro-meso level, using the cohesive zone modelling technique. A meso-scale finite element model comprising a number of plies was developed in the commercial finite element code Abaqus/Explicit. The interfaces between the plies were explicitly modelled using cohesive zone elements to allow for debonding by crack initiation and propagation. The load-deflection response of the CFRP within the quasi-static range was obtained and compared with data existing in the literature, providing validation of the model at the global scale. The outputs of the global model were then used to develop a simulation model capturing the micro-meso scale material features. The sub-model consisted of a refined-mesh representative volume element (RVE) modelled in the TexGen software, which was later embedded with cohesive elements in the finite element software environment. The developed strategy successfully predicted the overall load-deflection response and the damage in the global model and sub-model at the flexure limit of the specimen. A detailed analysis of the effects of the micro-scale features was carried out.
Keywords: woven composites, multi-scale modelling, cohesive zone, finite element model
Procedia PDF Downloads 140
6359 Quality of Care of Medical Male Circumcisions: A Non-Negotiable for Right to Care
Authors: Nelson Igaba, C. Onaga, S. Hlongwane
Abstract:
Background: Medical Male Circumcision (MMC) is part of a comprehensive HIV prevention strategy. The quality of MMC done at Right To Care (RtC) sites is maintained by Continuous Quality Improvement (CQI) based on findings of assessments by internal and independent external assessors who evaluate such parameters as the quality of the surgical procedure, infection control, etc. There are 12 RtC MMC teams in Mpumalanga, two of which are headed by Medical Officers and 10 by Clinical Associates (Clin A). Objectives: To compare the quality (i) of care rendered at doctor headed sites (DHS) versus Clin A headed sites (CHS); (ii) of CQI assessments (external versus internal). Methodology: A retrospective review of data from RightMax™ (a novel RtC data management system) and CQI reports (external and internal) was done. CQI assessment scores of October 2015 and October 2016 were taken as the baseline and latest respectively. Four sites with 745-810 circumcisions per annum were purposively selected; the two DHS (group A) and two CHS (group B). Statistical analyses were conducted using R (2017 version). Results: There were no significant difference in latest CQI scores between the two groups (DHS and CHS) (Anova, F = 1.97, df = 1, P = 0.165); between internal and external CQI assessment scores (Anova, F = 2.251, df = 1, P = 0.139) or among the individual sites (Anova, F = 1.095, df = 2, P = 0.341). Of the total of 16 adverse events reported by the four sites in the 12 months reviewed (all were infections), there was no statistical evidence that the documented severity of the infection was different for DHS and CHS (Fisher’s exact test, p-value = 0.269). 
Conclusion: At RtC VMMC sites in Mpumalanga, internal and external/independent CQI assessments are comparable, and the quality of care of VMMC is standardized, with the performance of well-supervised clinical associates comparing well with that of medical officers.
Keywords: adverse events, Right to Care, male medical circumcision, continuous quality improvement
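The group comparisons in this abstract are reported as one-way ANOVA results (an F statistic with degrees of freedom and a P value). As a hedged illustration of the statistic behind such a comparison, here is a minimal pure-Python sketch; the CQI score values below are invented for illustration, not the study's data.

```python
# Minimal one-way ANOVA F-statistic, of the kind used to compare CQI
# scores between site groups. The scores below are illustrative only,
# not the study's data.

def anova_f(groups):
    """Return (F, df_between, df_within) for a list of sample groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical CQI scores: doctor-headed vs Clin A-headed sites
dhs = [92.0, 94.5, 91.0, 93.5]
chs = [90.5, 93.0, 92.5, 91.5]
f, dfb, dfw = anova_f([dhs, chs])
print(f"F = {f:.2f}, df = ({dfb}, {dfw})")  # F = 0.84, df = (1, 6)
```

A small F value like this (well below conventional critical values) is what underlies "no significant difference" conclusions such as those reported above.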
Procedia PDF Downloads 178
6358 Library Support for the Intellectually Disabled: Book Clubs and Universal Design
Authors: Matthew Conner, Leah Plocharczyk
Abstract:
This study examines the role of academic libraries in support of the intellectually disabled (ID) in post-secondary education. With the growing public awareness of the ID, there has been recognition of their need for post-secondary educational opportunities. This was an unforeseen result for a population that has been associated with elementary levels of education, yet the reasons are compelling. After aging out of the school system, the ID need and deserve educational and social support as much as anyone. Moreover, the commitment to diversity in higher education rings hollow if this group is excluded. Yet, challenges remain to integrating the ID into a college curriculum. This presentation focuses on the role of academic libraries. Neglecting this vital resource for the support of the ID is not to be thought of, yet the library’s contribution is not clear. Library collections presume reading ability and libraries already struggle to meet their traditional goals with the resources available. This presentation examines how academic libraries can support post-secondary ID. For context, the presentation first examines the state of post-secondary education for the ID with an analysis of data on the United States compiled by the ThinkCollege! Project. Geographic Information Systems (GIS) and statistical analysis will show regional and methodological trends in post-secondary support of the ID which currently lack any significant involvement by college libraries. Then, the presentation analyzes a case study of a book club at the Florida Atlantic University (FAU) libraries which has run for several years. Issues such as the selection of books, effective pedagogies, and evaluation procedures will be examined. The study has found that the instruction pedagogies used by libraries can be extended through concepts of Universal Learning Design (ULD) to effectively engage the ID. 
In particular, student-centered, participatory methodologies that accommodate different learning styles have proven to be especially useful. The choice of text is complex and determined not only by reading ability but familiarity of subject and features of the ID’s developmental trajectory. The selection of text is not only a necessity but also promises to give insight into the ID. Assessment remains a complex and unresolved subject, but the voluntary, sustained, and enthusiastic attendance of the ID is an undeniable indicator. The study finds that, through the traditional library vehicle of the book club, academic libraries can support ID students through training in both reading and socialization, two major goals of their post-secondary education.Keywords: academic libraries, intellectual disability, literacy, post-secondary education
Procedia PDF Downloads 164
6357 A Corpus-Based Study on the Lexical, Syntactic and Sequential Features across Interpreting Types
Authors: Qianxi Lv, Junying Liang
Abstract:
Among the various modes of interpreting, simultaneous interpreting (SI) is regarded as a ‘complex’ and ‘extreme condition’ of cognitive tasks while consecutive interpreters (CI) do not have to share processing capacity between tasks. Given that SI exerts great cognitive demand, it makes sense to posit that the output of SI may be more compromised than that of CI in the linguistic features. The bulk of the research has stressed the varying cognitive demand and processes involved in different modes of interpreting; however, related empirical research is sparse. In keeping with our interest in investigating the quantitative linguistic factors discriminating between SI and CI, the current study seeks to examine the potential lexical simplification, syntactic complexity and sequential organization mechanism with a self-made inter-model corpus of transcribed simultaneous and consecutive interpretation, translated speech and original speech texts with a total running word of 321960. The lexical features are extracted in terms of the lexical density, list head coverage, hapax legomena, and type-token ratio, as well as core vocabulary percentage. Dependency distance, an index for syntactic complexity and reflective of processing demand is employed. Frequency motif is a non-grammatically-bound sequential unit and is also used to visualize the local function distribution of interpreting the output. While SI is generally regarded as multitasking with high cognitive load, our findings evidently show that CI may impose heavier or taxing cognitive resource differently and hence yields more lexically and syntactically simplified output. In addition, the sequential features manifest that SI and CI organize the sequences from the source text in different ways into the output, to minimize the cognitive load respectively. We reasoned the results in the framework that cognitive demand is exerted both on maintaining and coordinating component of Working Memory. 
On the one hand, the information maintained in CI is inherently larger in volume compared to SI. On the other hand, time constraints directly influence the sentence reformulation process. The temporal pressure from the input in SI makes the interpreters only keep a small chunk of information in the focus of attention. Thus, SI interpreters usually produce the output by largely retaining the source structure so as to relieve the information from the working memory immediately after formulated in the target language. Conversely, CI interpreters receive at least a few sentences before reformulation, when they are more self-paced. CI interpreters may thus tend to retain and generate the information in a way to lessen the demand. In other words, interpreters cope with the high demand in the reformulation phase of CI by generating output with densely distributed function words, more content words of higher frequency values and fewer variations, simpler structures and more frequently used language sequences. We consequently propose a revised effort model based on the result for a better illustration of cognitive demand during both interpreting types.Keywords: cognitive demand, corpus-based, dependency distance, frequency motif, interpreting types, lexical simplification, sequential units distribution, syntactic complexity
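Two of the lexical measures named in this abstract, the type-token ratio and lexical density, have simple standard definitions. A minimal sketch follows; the token list and the function-word set are illustrative assumptions, not drawn from the study's corpus.

```python
# Sketch of two lexical-simplification measures from the abstract:
# type-token ratio (distinct types / tokens) and lexical density
# (content words / all tokens). The sample sentence and the
# function-word list are illustrative, not the study's corpus.

FUNCTION_WORDS = {"the", "a", "of", "to", "and", "in", "is", "that"}

def type_token_ratio(tokens):
    return len(set(tokens)) / len(tokens)

def lexical_density(tokens):
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return len(content) / len(tokens)

tokens = "the interpreter kept the structure of the source speech".split()
print(round(type_token_ratio(tokens), 3))  # 7 types / 9 tokens -> 0.778
print(round(lexical_density(tokens), 3))   # 5 content words / 9 -> 0.556
```

Lower values of either measure over an interpreting output would indicate the kind of lexical simplification the study reports for CI.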
Procedia PDF Downloads 181
6356 Modelling the Behavior of Commercial and Test Textiles against Laundering Process by Statistical Assessment of Their Performance
Authors: M. H. Arslan, U. K. Sahin, H. Acikgoz-Tufan, I. Gocek, I. Erdem
Abstract:
Various exterior factors have perpetual effects on textile materials during wear, use and laundering in everyday life. In accordance with their frequency of use, textile materials are required to be laundered at certain intervals. The medium in which the laundering process takes place have inevitable detrimental physical and chemical effects on textile materials caused by the unique parameters of the process inherently existing. Connatural structures of various textile materials result in many different physical, chemical and mechanical characteristics. Because of their specific structures, these materials have different behaviors against several exterior factors. By modeling the behavior of commercial and test textiles as group-wise against laundering process, it is possible to disclose the relation in between these two groups of materials, which will lead to better understanding of their behaviors in terms of similarities and differences against the washing parameters of the laundering. Thus, the goal of the current research is to examine the behavior of two groups of textile materials as commercial textiles and as test textiles towards the main washing machine parameters during laundering process such as temperature, load quantity, mechanical action and level of water amount by concentrating on shrinkage, pilling, sewing defects, collar abrasion, the other defects other than sewing, whitening and overall properties of textiles. In this study, cotton fabrics were preferred as commercial textiles due to the fact that garments made of cotton are the most demanded products in the market by the textile consumers in daily life. Full factorial experimental set-up was used to design the experimental procedure. All profiles always including all of the commercial and the test textiles were laundered for 20 cycles by commercial home laundering machine to investigate the effects of the chosen parameters. 
For the laundering process, a modified version of ‘‘IEC 60456 Test Method’’ was utilized. The amount of detergent was altered as 0.5% gram per liter depending on varying load quantity levels. Datacolor 650®, EMPA Photographic Standards for Pilling Test and visual examination were utilized to test and characterize the textiles. Furthermore, in the current study the relation in between commercial and test textiles in terms of their performance was deeply investigated by the help of statistical analysis performed by MINITAB® package program modeling their behavior against the parameters of the laundering process. In the experimental work, the behaviors of both groups of textiles towards washing machine parameters were visually and quantitatively assessed in dry state.Keywords: behavior against washing machine parameters, performance evaluation of textiles, statistical analysis, commercial and test textiles
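The abstract describes a full factorial experimental set-up over the washing-machine parameters (temperature, load quantity, mechanical action, water level). A minimal sketch of how such a design is enumerated follows; the factor levels chosen below are hypothetical, not the study's settings.

```python
# Enumerate a full factorial design over the four washing-machine
# parameters named in the abstract. The levels for each factor are
# illustrative assumptions, not the study's experimental settings.
from itertools import product

factors = {
    "temperature_C": [30, 40, 60],
    "load_quantity_kg": [2, 4],
    "mechanical_action": ["gentle", "normal"],
    "water_level": ["low", "high"],
}

# One run per combination of levels: 3 * 2 * 2 * 2 = 24 profiles
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))
print(runs[0])
```

Every textile profile would then be laundered under each of the enumerated runs (here 24), which is what makes the design "full factorial".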
Procedia PDF Downloads 362
6355 Investigating Complement Clause Choice in Written Educated Nigerian English (ENE)
Authors: Juliet Udoudom
Abstract:
Inappropriate complement selection is one of the major features of non-standard complementation in the sentence-construction output of Nigerian users of English. This paper investigates complement clause choice in Written Educated Nigerian English (ENE) and offers some results. It aims at determining preferred and dispreferred patterns of complement clause selection in respect of verb heads in English by selected Nigerian users of English. The complementation data analyzed in this investigation were obtained from experimental tasks designed to elicit complement categories of verb, noun, adjective and prepositional heads in English. Insights from Government-Binding relations were employed in analyzing the data, which comprised responses from one hundred subjects to a picture elicitation exercise, a grammaticality judgement test, and a free composition task. The findings indicate a general tendency for clausal complements (CPs) introduced by the complementizer that to be preferred by the subjects studied. Of the 235 tokens of clausal complements which occurred in our corpus, 128, representing 54.46%, were CPs headed by that, while whether- and if-clauses recorded 31.07% and 8.94%, respectively. The complement clause type with the lowest incidence of choice was the CP headed by the complementiser for, with a 5.53% incidence of occurrence. Further findings indicate that the semantic features of the relevant embedding verb heads were not taken into consideration in the choice of the complementisers which introduce the respective complement clauses; hence the that-clause was chosen to complement verbs like prefer. In addition, the dispreferred choice of the for-clause is explicable by the fact that the respondents studied regard 'for' as a preposition, not a complementiser.
Keywords: complement, complement clause, complement selection, complementisers, government-binding
Procedia PDF Downloads 188
6354 The Per Capita Income, Energy production and Environmental Degradation: A Comprehensive Assessment of the existence of the Environmental Kuznets Curve Hypothesis in Bangladesh
Authors: Ashique Mahmud, MD. Ataul Gani Osmani, Shoria Sharmin
Abstract:
In the first quarter of the twenty-first century, the most substantial global concern is environmental contamination, and it has gained the prioritization of both the national and international community. Keeping this crucial fact in mind, this study applied different statistical and econometric methods to identify whether the gross national income of the country has a significant impact on electricity production from nonrenewable sources and on different air pollutants such as carbon dioxide, nitrous oxide, and methane emissions. The primary objective of this research was to analyze whether the environmental Kuznets curve hypothesis holds for the examined variables. After analyzing different statistical properties of the variables, this study came to the conclusion that the environmental Kuznets curve hypothesis holds for gross national income and carbon dioxide emission in Bangladesh in the short run as well as the long run. This conclusion is based on the findings of ordinary least squares estimations, ARDL bound tests, short-run causality analysis, the error correction model, and the other pre-diagnostic and post-diagnostic tests employed in the structural model. Moreover, this study argues that the trajectory of gross national income and carbon dioxide emissions is in its initial stage of development and will increase up to the optimal peak. The compositional effect will then force emissions to decrease, and environmental quality will be restored in the long run.
Keywords: environmental Kuznets curve hypothesis, carbon dioxide emission in Bangladesh, gross national income in Bangladesh, autoregressive distributed lag model, granger causality, error correction model
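The environmental Kuznets curve is commonly estimated as a quadratic in (log) income, and the "optimal peak" mentioned above is the turning point implied by that quadratic. A minimal sketch of recovering it follows; the coefficient values are hypothetical, not the paper's estimates.

```python
# Sketch of the EKC logic: for a fitted quadratic
#   ln(CO2) = a + b*ln(GNI) + c*ln(GNI)^2   with c < 0,
# emissions rise with income up to a turning point and then fall.
# The peak income is GNI* = exp(-b / (2c)).
# The coefficients below are hypothetical, not the study's estimates.
import math

def ekc_turning_point(b, c):
    if c >= 0:
        raise ValueError("No inverted U: the squared term must be negative")
    return math.exp(-b / (2 * c))

a, b, c = -12.0, 3.2, -0.2  # hypothetical OLS estimates on logged data
peak_income = ekc_turning_point(b, c)
print(round(peak_income, 1))
```

A significantly negative coefficient on the squared income term is the usual statistical evidence that the inverted-U (EKC) shape holds.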
Procedia PDF Downloads 150
6353 Analysis of Kilistra (Gokyurt) Settlement within the Context of Traditional Residential Architecture
Authors: Esra Yaldız, Tugba Bulbul Bahtiyar, Dicle Aydın
Abstract:
Humans meet their need for shelter via housing which they structure in line with habits and necessities. In housing culture, traditional dwelling has an important role as a social and cultural transmitter. It provides concrete data by being planned in parallel with users’ life style and habits, having their own dynamics and components as well as their designs in harmony with nature, environment and the context they exist. Textures of traditional dwelling create a healthy and cozy living environment by means of adaptation to natural conditions, topography, climate, and context; utilization of construction materials found nearby and usage of traditional techniques and forms; and natural isolation of construction materials used. One of the examples of traditional settlements in Anatolia is Kilistra (Gökyurt) settlement of Konya province. Being among the important centers of Christianity in the past, besides having distinctive architecture, culture, natural features, and geographical differences (climate, geological structure, material), Kilistra can also be identified as a traditional settlement consisting of family, religious and economic structures as well as cultural interaction. The foundation of this study is the traditional residential texture of Kilistra with its unique features. The objective of this study is to assess the conformity of traditional residential texture of Kilistra with present topography, climatic data, and geographical values within the context of human scale construction, usage of green space, indigenous construction materials, construction form, building envelope, and space organization in housing.Keywords: traditional residential architecture, Kilistra, Anatolia, Konya
Procedia PDF Downloads 414
6352 Automatic Content Curation of Visual Heritage
Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara DéFayes, Mathieu Salzmann, FréDéRic Kaplan, Nicolas Henchoz
Abstract:
Digitization and preservation of large heritage induce high maintenance costs to keep up with the technical standards and ensure sustainable access. Creating impactful usage is instrumental to justify the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections of the world from which 52’000 were digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting associations performed by curator and professional designers. The exposed challenge finds similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques and more specifically, deep-learning algorithms have been used to facilitate their generations. Promising results were found thanks to Recurrent Neural Networks (RNN) trained on manually generated playlist and paired with clusters of extracted features from songs. We used the same principles to create the proposed algorithm but applied to a challenging medium, posters. First, a convolutional autoencoder was trained to extract features of the posters. The 52’000 digital posters were used as a training set. Poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. RNN training set was composed of poster sequences extracted from a collection of books from the Gestaltung Museum of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster with the best proximity compared to the previous poster is selected. The mean square distance between features of posters was used to compute the proximity. 
To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. Manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25% and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45% and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to designer sequences, and as reported by participants, model and random sequences lacked thematic continuity. According to the results, the proposed model is able to generate better poster sequencing compared to random sampling. Eventually, our algorithm is sometimes able to outperform a professional designer. As a next step, the proposed algorithm should include a possibility to create sequences according to a selected theme. To conclude, this work shows the potentiality of artificial intelligence techniques to learn from existing content and provide a tool to curate large sets of data, with a permanent renewal of the presented content.Keywords: Artificial Intelligence, Digital Humanities, serendipity, design research
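The final selection step described above (within the RNN-predicted cluster, choose the poster whose features are closest to the previous poster by mean squared distance) can be sketched as follows. The feature vectors and poster identifiers are made-up stand-ins for the convolutional autoencoder's output, not real collection data.

```python
# Sketch of the last step of the sequencing pipeline: within the
# cluster predicted by the RNN, pick the poster whose feature vector
# has the smallest mean squared distance to the previous poster.
# Vectors and poster ids are made-up stand-ins for autoencoder output.

def mean_squared_distance(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v)) / len(u)

def next_poster(prev_features, cluster):
    """cluster: dict mapping poster_id -> feature vector."""
    return min(cluster,
               key=lambda pid: mean_squared_distance(prev_features, cluster[pid]))

prev = [0.2, 0.8, 0.1]
predicted_cluster = {
    "poster_17": [0.9, 0.1, 0.5],
    "poster_42": [0.25, 0.75, 0.2],
    "poster_66": [0.6, 0.6, 0.6],
}
print(next_poster(prev, predicted_cluster))  # poster_42 is closest
```

Clustering first and only then ranking by feature proximity keeps the sequence thematically coherent while still allowing variation within a cluster.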
Procedia PDF Downloads 186
6351 The Role of Creative Entrepreneurship in the Development of Croatian Economy
Authors: Marko Kolakovic
Abstract:
Creative industries are an important sector of growth and development of knowledge economies. They have a positive impact on employment, economic growth, export and the quality of life in the areas where they are developed. Creative sectors include architecture, design, advertising, publishing, music, film, television and radio, video games, visual and performing arts and heritage. Following the positive development trends of creative industries at the global and European level, this paper analyzes creative industries in general and the specific characteristics of creative entrepreneurship. Special focus is put on the influence of information and communication technology on the development of new creative business models and the protection of intellectual property rights. One part of the paper analyzes the status of creative industries and creative entrepreneurship in Croatia. The main objective of the paper is to draw conclusions about the potential and development of creative industries in Croatia, using statistical analysis of the sector and information gained during interviews with entrepreneurs. Creative industries in Croatia are at the beginning of their development, and a growth strategy still does not exist at the national level. The statistical analysis showed that in 2015 creative enterprises accounted for 9% of all enterprises in Croatia, employed 5.5% of the workforce, and contributed 4.01% of GDP. Croatian creative entrepreneurs build competitive advantage using their creative resources and creating specific business models. The main obstacles they meet are a lack of business experience and the impossibility of focusing on creative activities only. In their business, they use digital technologies and are focused on export.
The conclusion is that creative industries in Croatia have development potential, but it is necessary to take adequate measures to use this potential in the right way.
Keywords: creative entrepreneurship, knowledge economy, business models, intellectual property
Procedia PDF Downloads 210
6350 Assessment of the Effects of Urban Development on Urban Heat Islands and Community Perception in Semi-Arid Climates: Integrating Remote Sensing, GIS Tools, and Social Analysis - A Case Study of the Aures Region (Khanchela), Algeria
Authors: Amina Naidja, Zedira Khammar, Ines Soltani
Abstract:
This study investigates the impact of urban development on the urban heat island (UHI) effect in the semi-arid Aures region of Algeria, integrating remote sensing data with statistical analysis and community surveys to examine the interconnected environmental and social dynamics. Using Landsat 8 satellite imagery, temporal variations in the Normalized Difference Vegetation Index (NDVI), Normalized Difference Built-up Index (NDBI), and land use/land cover (LULC) changes are analyzed to understand patterns of urbanization and environmental transformation. These environmental metrics are correlated with land surface temperature (LST) data derived from remote sensing to quantify the UHI effect. To incorporate the social dimension, a structured questionnaire survey is conducted among residents in selected urban areas. The survey assesses community perceptions of urban heat, its impacts on daily life, health concerns, and coping strategies. Statistical analysis is employed to analyze survey responses, identifying correlations between demographic factors, socioeconomic status, and perceived heat stress. Preliminary findings reveal significant correlations between built-up areas (NDBI) and higher LST, indicating the contribution of urbanization to local warming. Conversely, areas with higher vegetation cover (NDVI) exhibit lower LST, highlighting the cooling effect of green spaces. Social survey results provide insights into how UHI affects different demographic groups, with vulnerable populations experiencing greater heat-related challenges. By integrating remote sensing analysis with statistical modeling and community surveys, this study offers a comprehensive understanding of the environmental and social implications of urban development in semi-arid climates. The findings contribute to evidence-based urban planning strategies that prioritize environmental sustainability and social well-being. 
Future research should focus on policy recommendations and community engagement initiatives to mitigate UHI impacts and promote climate-resilient urban development.Keywords: urban heat island, remote sensing, social analysis, NDVI, NDBI, LST, community perception
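The two spectral indices used in this study have standard definitions computed from Landsat 8 band reflectances (NIR is band 5, Red band 4, SWIR1 band 6). A minimal sketch follows; the reflectance values below are made up for illustration, not the study's imagery.

```python
# Standard definitions of the indices used in the study, computed from
# Landsat 8 surface reflectances: NDVI = (NIR - Red) / (NIR + Red) and
# NDBI = (SWIR1 - NIR) / (SWIR1 + NIR). The reflectance values below
# are made up for illustration, not taken from the study's imagery.

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def ndbi(swir1, nir):
    return (swir1 - nir) / (swir1 + nir)

# Vegetated pixel: high NIR, low Red -> high NDVI, negative NDBI
print(round(ndvi(nir=0.45, red=0.05), 2), round(ndbi(swir1=0.15, nir=0.45), 2))
# Built-up pixel: SWIR1 above NIR -> positive NDBI, low NDVI
print(round(ndvi(nir=0.20, red=0.15), 2), round(ndbi(swir1=0.30, nir=0.20), 2))
```

Correlating per-pixel values of these indices against land surface temperature is what yields the study's finding that built-up areas warm and vegetated areas cool.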
Procedia PDF Downloads 43
6349 Comparison of Different Machine Learning Algorithms for Solubility Prediction
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and material science. In this study, we compare the performance of five machine learning algorithms—linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks—for predicting molecular solubility using the AqSolDB dataset. The dataset consists of 9981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) are extracted for every SMILES representation in the dataset, giving a total of 189 features for training and testing for every molecule. Each algorithm is trained on a subset of the dataset and evaluated using accuracy scores. Additionally, the computational time for training and testing is recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in terms of predictive accuracy, achieving a 0.93 accuracy score. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to the ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
Keywords: random forest, machine learning, comparison, feature extraction
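The per-molecule representation described in this abstract concatenates three feature groups into a 189-dimensional vector. A sketch of that assembly step follows; the values are placeholders, since in a real pipeline the fingerprint and descriptors would be computed from the SMILES string (e.g. with RDKit).

```python
# Sketch of the 189-dimensional per-molecule feature vector from the
# abstract: 166 MACCS key bits + 20 RDKit descriptors + 3 structural
# properties. All values below are placeholders; a real pipeline would
# compute them from each SMILES string.

def featurize(maccs_bits, rdkit_descriptors, structural):
    assert len(maccs_bits) == 166
    assert len(rdkit_descriptors) == 20
    assert len(structural) == 3
    return list(maccs_bits) + list(rdkit_descriptors) + list(structural)

maccs = [0] * 166        # placeholder fingerprint bits
rdkit_desc = [0.0] * 20  # placeholder descriptors (e.g. MolWt, LogP)
structural = [0.0] * 3   # placeholder structural properties
features = featurize(maccs, rdkit_desc, structural)
print(len(features))  # 166 + 20 + 3 = 189
```

Each of the five compared algorithms would then be trained on this same 189-column feature matrix, so differences in accuracy reflect the models rather than the representation.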
Procedia PDF Downloads 42
6348 Polycystic Ovary Syndrome: Cervical Cytology Features and Its Association with Endometrial Cancer
Authors: Faezah Shekh Abdullah, Mohd. Azizuddin Mohd. Yussof, Komathy Thiagarajan, Hasnoorina Husin, Noor Azreena Abd Aziz
Abstract:
Polycystic ovary syndrome has been associated with multiple disorders such as endocrine disorders, metabolic syndrome, infertility, and endometrial cancer. Women with polycystic ovary syndrome (PCOS) are anticipated to have a threefold higher chance of developing endometrial cancer than women without PCOS. This study, therefore, was conducted to determine the association between polycystic ovary syndrome and endometrial cancer and to determine the cervical cytology features of PCOS. Patients attending the Subfertility Clinic of the National Population and Family Development Board were recruited and examined physically by medical practitioners. They were categorized into two groups: i) the PCOS group if they met the Rotterdam Criteria 2004 and ii) the control group if they did not. Cervical sampling was done on all patients via the liquid-based cytology (LBC) method pre- and post-subfertility treatment. A total of 167 patients participated in the study, of which 79 belonged to the PCOS group and 88 to the control group. The findings showed no cervical or endometrial cancer cases in either group. The liquid-based cytology results in the PCOS group displayed more cases with cellular changes, i.e., benign inflammation, atrophic smear and Candida sp. infection. To conclude, no association was found between polycystic ovary syndrome and endometrial cancer. A more holistic study with a higher number of participants could further determine the association between endometrial cancer and PCOS. Furthermore, a longer interval between pre- and post-subfertility-treatment LBC sampling should be allowed to observe changes in the cervical cells.
Keywords: endometrial cancer, liquid-based cytology, PCOS, polycystic ovary syndrome
Procedia PDF Downloads 144
6347 On the Bias and Predictability of Asylum Cases
Authors: Panagiota Katsikouli, William Hamilton Byrne, Thomas Gammeltoft-Hansen, Tijs Slaats
Abstract:
An individual who demonstrates a well-founded fear of persecution or faces real risk of being subjected to torture is eligible for asylum. In Danish law, the exact legal thresholds reflect those established by international conventions, notably the 1951 Refugee Convention and the 1950 European Convention for Human Rights. These international treaties, however, remain largely silent when it comes to how states should assess asylum claims. As a result, national authorities are typically left to determine an individual’s legal eligibility on a narrow basis consisting of an oral testimony, which may itself be hampered by several factors, including imprecise language interpretation, insecurity or lacking trust towards the authorities among applicants. The leaky ground, on which authorities must assess their subjective perceptions of asylum applicants' credibility, questions whether, in all cases, adjudicators make the correct decision. Moreover, the subjective element in these assessments raises questions on whether individual asylum cases could be afflicted by implicit biases or stereotyping amongst adjudicators. In fact, recent studies have uncovered significant correlations between decision outcomes and the experience and gender of the assigned judge, as well as correlations between asylum outcomes and entirely external events such as weather and political elections. In this study, we analyze a publicly available dataset containing approximately 8,000 summaries of asylum cases, initially rejected, and re-tried by the Refugee Appeals Board (RAB) in Denmark. First, we look for variations in the recognition rates, with regards to a number of applicants’ features: their country of origin/nationality, their identified gender, their identified religion, their ethnicity, whether torture was mentioned in their case and if so, whether it was supported or not, and the year the applicant entered Denmark. 
In order to extract these features, as well as the final decision of the RAB, from the text summaries, we applied natural language processing and regular expressions, adjusting for the Danish language. We observed interesting variations in recognition rates related to the applicants' country of origin, ethnicity, year of entry and the support or not of torture claims, whenever such claims were made in the case. The appearance (or not) of significant variations in the recognition rates does not necessarily imply (or rule out) bias in the decision-making process. None of the considered features, with the possible exception of the torture claims, should be decisive factors for an asylum seeker's fate. We therefore investigate whether the decision can be predicted on the basis of these features and, consequently, whether biases are likely to exist in the decision-making process. We employed a number of machine learning classifiers and found that when using the applicant's country of origin, religion, ethnicity and year of entry with a random forest classifier or a decision tree, the prediction accuracy is as high as 82% and 85%, respectively, suggesting that these features carry potentially predictive properties with regard to the outcome of an asylum case. Our analysis and findings call for further investigation of the predictability of the outcome on a larger dataset of 17,000 cases, which is ongoing.
Keywords: asylum adjudications, automated decision-making, machine learning, text mining
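As a hedged illustration of how "predictability from a feature" can be tested at all, the sketch below uses a majority-vote-per-group baseline over one categorical feature. This is deliberately a much simpler stand-in for the study's random forest and decision tree, and the cases are invented, not RAB data.

```python
# Illustration of outcome predictability from a categorical feature:
# predict each group's most frequent training outcome (a simple
# stand-in for the study's random forest / decision tree classifiers).
# All cases below are invented, not RAB data.
from collections import Counter, defaultdict

def fit_majority_by_group(cases):
    """cases: list of (feature_value, outcome) pairs."""
    by_group = defaultdict(list)
    for feature, outcome in cases:
        by_group[feature].append(outcome)
    return {g: Counter(v).most_common(1)[0][0] for g, v in by_group.items()}

train = [("A", "granted"), ("A", "granted"), ("A", "rejected"),
         ("B", "rejected"), ("B", "rejected"), ("B", "granted")]
model = fit_majority_by_group(train)

held_out = [("A", "granted"), ("B", "rejected"), ("B", "granted")]
accuracy = sum(model[f] == o for f, o in held_out) / len(held_out)
print(model, round(accuracy, 2))
```

If even such a crude per-feature baseline beats chance on held-out cases, the feature carries predictive information about the outcome, which is exactly the concern the abstract raises for features that should not be decisive.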
Procedia PDF Downloads 97
6346 Blood Flow Simulations to Understand the Role of the Distal Vascular Branches of Carotid Artery in the Stroke Prediction
Authors: Muhsin Kizhisseri, Jorg Schluter, Saleh Gharie
Abstract:
Atherosclerosis is the main cause of stroke, one of the deadliest diseases in the world. The carotid artery, which supplies blood to the brain, is a prominent site of atherosclerotic progression, which hinders blood flow into the brain. Including computational fluid dynamics (CFD) in the diagnostic cycle to understand the hemodynamics of the patient-specific carotid artery can give insights into stroke prediction. Realistic outlet boundary conditions are an essential part of numerical simulations and one of the major factors determining the accuracy of CFD results. Windkessel-model-based outlet boundary conditions can capture realistic characteristics of the distal vascular branches of the carotid artery, such as their resistance to blood flow and the compliance of the distal arterial walls. This study aims to find the most influential distal branches of the carotid artery by using the Windkessel model parameters in the outlet boundary conditions. A parametric study of the Windkessel model parameters can incorporate the geometrical features of the distal branches, such as radius and length. Incorporating variations of the geometrical features of the major distal branches, such as the middle cerebral artery, anterior cerebral artery, and ophthalmic artery, through the Windkessel model can aid in identifying the most influential distal branch of the carotid artery. The results from this study can help physicians and stroke neurologists form a more detailed and accurate judgment of the patient's condition.
Keywords: stroke, carotid artery, computational fluid dynamics, patient-specific, Windkessel model, distal vascular branches
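The resistance-and-compliance behaviour that the abstract attributes to the Windkessel outlets can be illustrated with a minimal lumped model. The sketch below uses the two-element (RC) variant, a simplification of the (typically three-element) models used in carotid CFD; all parameter values and the inflow waveform are illustrative, not patient-specific.

```python
import math

# Two-element (RC) Windkessel for one outlet: the distal branch is lumped into
# a peripheral resistance R and an arterial compliance C (arbitrary units).
R = 1.0      # peripheral resistance (pressure / flow)
C = 1.0      # arterial compliance (volume / pressure)
dt = 1e-3    # time step (s)
T = 10.0     # simulate 10 s so the startup transient dies out

def inflow(t):
    """Illustrative pulsatile inflow from the CFD domain (1 Hz cardiac cycle)."""
    return 5.0 + 5.0 * math.sin(2.0 * math.pi * t)

# Governing ODE of the RC Windkessel:  C * dP/dt = Q(t) - P / R
P = 0.0
t = 0.0
while t < T:
    P += dt * (inflow(t) - P / R) / C   # forward-Euler step
    t += dt

print(round(P, 3))  # outlet pressure, oscillating around R * mean(Q) = 5
```

In a coupled simulation this pressure would be fed back to the CFD outlet at every time step; varying R and C per branch (e.g., derived from branch radius and length) is what the parametric study described above amounts to.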
Procedia PDF Downloads 216
6345 Statistical Modelling of Maximum Temperature in Rwanda Using Extreme Value Analysis
Authors: Emmanuel Iyamuremye, Edouard Singirankabo, Alexis Habineza, Yunvirusaba Nelson
Abstract:
Temperature is one of the most important climatic factors for crop production. However, extreme temperatures cause droughts, heat waves, and cold spells that have various consequences for human life, agriculture, and the environment in general. It is necessary to provide reliable information on such incidents and on the probability of such extreme events occurring. In the 21st century, the world faces a huge number of threats, especially from climate change due to global warming and environmental degradation. The rise in temperature has a direct effect on the decrease in rainfall. This has an impact on crop growth and development, which in turn decreases crop yield and quality. Countries that are heavily dependent on agriculture tend to suffer the most and need to take preventive steps to overcome these challenges. The main objective of this study is to model the statistical behaviour of extreme maximum temperature values in Rwanda. To this end, daily temperature data spanning the period from January 2000 to December 2017, recorded at nine weather stations and obtained from the Rwanda Meteorological Agency, were used. Two methods, namely the block maxima (BM) method and the peaks-over-threshold (POT) method, were applied to model and analyse extreme temperatures. Model parameters were estimated, and extreme temperature return levels and confidence intervals were predicted. The model fit suggests the Gumbel and Beta distributions to be the most appropriate models for the annual maxima of daily temperature. The results show that the temperature will continue to increase, as indicated by the estimated return levels.
Keywords: climate change, global warming, extreme value theory, Rwanda, temperature, generalised extreme value distribution, generalised Pareto distribution
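The block-maxima workflow described above (fit a GEV family to annual maxima, then read off return levels) can be sketched in a few lines. The data below are synthetic Gumbel draws standing in for one station's 2000-2017 record; the values are not the Rwandan observations.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual maxima of daily max temperature (deg C), 18 "years".
rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=32.0, scale=1.5, size=18)

# Block-maxima approach: fit a GEV to the annual maxima by maximum likelihood.
# Note: scipy's shape parameter c is the negative of the usual GEV shape xi.
c, loc, scale = genextreme.fit(annual_max)

def return_level(T):
    """Temperature exceeded on average once every T years under the fitted GEV."""
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

rl10, rl100 = return_level(10), return_level(100)
print(round(rl10, 2), round(rl100, 2))
```

A confidence interval for each return level, as computed in the study, would typically come from the delta method or profile likelihood on the fitted parameters; the POT branch of the analysis would instead fit a generalised Pareto distribution to threshold exceedances.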
Procedia PDF Downloads 185
6344 Biophysical Features of Glioma-Derived Extracellular Vesicles as Potential Diagnostic Markers
Authors: Abhimanyu Thakur, Youngjin Lee
Abstract:
Glioma is a lethal brain cancer whose early diagnosis and prognosis are limited by the dearth of suitable techniques for its early detection. Current approaches to diagnosing this lethal disease, including magnetic resonance imaging (MRI), computed tomography (CT), and invasive biopsy, have several limitations, demanding an alternative method. Recently, extracellular vesicles (EVs), chiefly exosomes and microvesicles (MVs), have been used in numerous biomarker studies; they are released by most cell types and are found in biofluids including blood, cerebrospinal fluid (CSF), and urine. Remarkably, glioma cells (GMs) release a large number of EVs, which can cross the blood-brain barrier (BBB) and carry the constituents of the parent GMs, including proteins and lncRNAs; however, the biophysical properties of EVs have not yet been explored as biomarkers for glioma. We isolated EVs from the cell-culture-conditioned medium of GMs and of regular primary cultures, and from the blood and urine of wild-type (WT) and glioma mouse models, and characterized them by nanoparticle tracking analysis, transmission electron microscopy, immunogold EM, and dynamic light scattering. Next, we measured the biophysical parameters of GM-derived EVs using atomic force microscopy. Further, the functional constituents of the EVs were examined by FTIR and Raman spectroscopy. Exosomes and MVs derived from GMs, blood, and urine showed distinct biophysical parameters (roughness, adhesion force, and stiffness), differing from those of regular primary glial cells and of WT blood and urine, which can be attributed to their characteristic functional constituents. Therefore, biophysical features can serve as potential diagnostic biomarkers for glioma.
Keywords: glioma, extracellular vesicles, exosomes, microvesicles, biophysical properties
Procedia PDF Downloads 143