Search results for: tiger representation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1264

394 Effects of Watershed Erosion on Stream Channel Formation

Authors: Tiao Chang, Ivan Caballero, Hong Zhou

Abstract:

Streams carry water and sediment naturally by maintaining channel dimensions, pattern, and profile over time. Watershed erosion is a natural process that contributes sediment to streams over time. The formation of channel dimensions is complex. This study relates quantifiable and consistent channel dimensions at the bankfull stage to the corresponding watershed erosion estimated by the Revised Universal Soil Loss Equation (RUSLE). Twelve sites with drainage areas ranging from 7 to 100 square miles in the Hocking River Basin of Ohio were selected for bankfull geometry determinations, including width, depth, cross-sectional area, bed slope, and drainage area. The twelve sub-watersheds were chosen to obtain a good overall representation of the Hocking River Basin. It is of interest to determine how these bankfull channel dimensions are related to the soil erosion of the corresponding sub-watersheds. The RUSLE was applied to estimate erosion for the twelve selected sub-watersheds where the bankfull geometry measurements were conducted. These quantified erosion estimates were used to investigate correlations with bankfull channel dimensions, including discharge, channel width, channel depth, cross-sectional area, and pebble distribution. It is found that drainage area, bankfull discharge, and cross-sectional area correlate strongly with watershed erosion. Furthermore, bankfull width and depth are moderately correlated with watershed erosion, while the particle size, D50, of the channel bed sediment is not well correlated with watershed erosion.
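The RUSLE estimate behind these correlations is a simple multiplicative model, A = R × K × LS × C × P. The following is a minimal sketch of that computation and of correlating erosion with one bankfull dimension; all factor values and channel areas are hypothetical placeholders, not the study's data.

```python
import numpy as np

def rusle_soil_loss(R, K, LS, C, P):
    """Revised Universal Soil Loss Equation: A = R * K * LS * C * P, where
    R is rainfall erosivity, K soil erodibility, LS slope length/steepness,
    C cover management, and P the support practice factor."""
    return R * K * LS * C * P

# Hypothetical factor values for three sub-watersheds (not the study's data)
erosion = np.array([rusle_soil_loss(125, 0.30, 1.2, 0.15, 0.9),
                    rusle_soil_loss(125, 0.28, 1.6, 0.20, 1.0),
                    rusle_soil_loss(125, 0.33, 0.9, 0.10, 0.8)])
bankfull_area = np.array([180.0, 260.0, 120.0])  # cross-sectional area, sq ft

# Pearson correlation between watershed erosion and a bankfull dimension
r = np.corrcoef(erosion, bankfull_area)[0, 1]
print(f"correlation(erosion, bankfull area) = {r:.2f}")
```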

Keywords: watershed, stream, sediment, channel

Procedia PDF Downloads 285
393 Development of Medical Intelligent Process Model Using Ontology Based Technique

Authors: Emmanuel Chibuogu Asogwa, Tochukwu Sunday Belonwu

Abstract:

An urgent demand for creative solutions has been created by the rapid expansion of medical knowledge, the complexity of patient care, and the requirement for more precise decision-making. The creation of a Medical Intelligent Process Model (MIPM) utilizing ontology-based techniques appears to be a promising way to overcome this obstacle and unleash the full potential of healthcare systems. The development of the MIPM is motivated by a lack of quick access to relevant medical information and of advanced tools for treatment planning and clinical decision-making, which ontology-based techniques can provide. The aim of this work is to develop a structured and knowledge-driven framework that leverages ontology, a formal representation of domain knowledge, to enhance various aspects of healthcare. The Object-Oriented Analysis and Design Methodology (OOADM) was adopted in the design of the system, as we desired to build a usable and evolvable application. For the implementation of this work, the medical dataset used to test our model was obtained from Kaggle. The ontology-based technique was used together with a confusion matrix, MySQL, Python, Hypertext Markup Language (HTML), Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, Dreamweaver, and Fireworks. According to test results evaluated with the confusion matrix, both the accuracy and the overall effectiveness of the medical intelligent process improved significantly, by 20%, compared to the previous system. The use of the model is therefore recommended for healthcare professionals.
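The accuracy figure reported above comes from a confusion matrix. As a small illustration, the sketch below computes accuracy (plus precision and recall) from a hypothetical 2×2 matrix; the counts are invented for the example, not taken from the study.

```python
import numpy as np

# Hypothetical 2x2 confusion matrix (rows: actual class, columns: predicted)
cm = np.array([[90, 10],   # actual positive: TP, FN
               [15, 85]])  # actual negative: FP, TN

accuracy = np.trace(cm) / cm.sum()                 # (TP + TN) / total
precision = cm[0, 0] / (cm[0, 0] + cm[1, 0])       # TP / (TP + FP)
recall = cm[0, 0] / (cm[0, 0] + cm[0, 1])          # TP / (TP + FN)
print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")
```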

Keywords: ontology-based, model, database, OOADM, healthcare

Procedia PDF Downloads 77
392 Fake News Detection Based on Fusion of Domain Knowledge and Expert Knowledge

Authors: Yulan Wu

Abstract:

The spread of fake news on social media has caused significant societal harm to the public and the nation, with its threats spanning various domains, including politics, economics, health, and more. News on social media often covers multiple domains, and existing models studied by researchers and relevant organizations often perform well on datasets from a single domain. However, when these methods are applied to social platforms with news spanning multiple domains, their performance deteriorates significantly. Existing research has attempted to enhance detection performance on multi-domain datasets by adding single-domain labels to the data. However, these methods overlook the fact that a news article typically belongs to multiple domains, leading to the loss of the domain knowledge contained within the news text. Moreover, research has found that news records in different domains often use different vocabularies to describe their content. To address these issues, this paper proposes a fake news detection framework that combines domain knowledge and expert knowledge. Firstly, it utilizes an unsupervised domain discovery module to generate a low-dimensional vector for each news article, representing domain embeddings that retain multi-domain knowledge of the news content. Then, a feature extraction module uses the domain embeddings discovered through unsupervised domain discovery to guide multiple experts in extracting news knowledge for the total feature representation. Finally, a classifier is used to determine whether the news is fake or not. Experiments show that this approach can improve multi-domain fake news detection performance while reducing the cost of manually labeling domain labels.
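One way to read the "domain embeddings guide multiple experts" step is as a gated mixture of experts. The sketch below is an illustrative PyTorch rendering under that assumption; the dimensions, expert count, and layer choices are invented for the example and are not the paper's architecture.

```python
import torch
import torch.nn as nn

class DomainGuidedExperts(nn.Module):
    """Sketch: a domain embedding gates several expert feature extractors,
    and the aggregated features feed a binary fake-news classifier."""
    def __init__(self, text_dim=768, domain_dim=16, n_experts=4, hidden=128):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU())
             for _ in range(n_experts)])
        self.gate = nn.Sequential(nn.Linear(domain_dim, n_experts), nn.Softmax(dim=-1))
        self.classifier = nn.Linear(hidden, 2)

    def forward(self, text_vec, domain_emb):
        weights = self.gate(domain_emb)                                   # (B, n_experts)
        feats = torch.stack([e(text_vec) for e in self.experts], dim=1)  # (B, n_experts, hidden)
        fused = (weights.unsqueeze(-1) * feats).sum(dim=1)               # weighted expert mixture
        return self.classifier(fused)

model = DomainGuidedExperts()
logits = model(torch.randn(8, 768), torch.randn(8, 16))
print(logits.shape)  # torch.Size([8, 2])
```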

Keywords: fake news, deep learning, natural language processing, multiple domains

Procedia PDF Downloads 72
391 Tractography Analysis and the Evolutionary Origin of Schizophrenia

Authors: Mouktafi Amine, Tahiri Asmaa

Abstract:

A substantial body of traditional medical research has been devoted to managing and treating mental disorders. At present, to the best of our knowledge, the underlying causes of the majority of psychological disorders remain insufficiently understood, and further exploration is needed to inform early diagnosis, symptom management, and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to evolutionary adaptations of the human brain, reflected in the brain connectivity and asymmetry that underlie humans' higher cognition; other primates, in contrast, are our closest living representation of the brain structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the Hippocampal Formation (HF) and the Dorsolateral Prefrontal Cortex (DLPFC). This paper presents the results of tractography analysis using multiple open-access Diffusion Weighted Imaging (DWI) datasets of healthy subjects, subjects affected by schizophrenia, and primates, to illustrate the relevance of the aforementioned brain regions' connectivity and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns, in conjunction with other fiber tracking metrics: Fractional Anisotropy (FA), Mean Diffusivity (MD), and Radial Diffusivity (RD).
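The three diffusion metrics named at the end are simple functions of the diffusion tensor's eigenvalues. A minimal sketch follows, using hypothetical eigenvalues for a single white-matter voxel rather than values from the datasets used in the paper.

```python
import numpy as np

def dti_metrics(eigvals):
    """Compute FA, MD, and RD from the three diffusion-tensor eigenvalues
    (sorted so that l1 is the axial/principal eigenvalue)."""
    l1, l2, l3 = np.sort(eigvals)[::-1]
    md = (l1 + l2 + l3) / 3.0                        # mean diffusivity
    fa = np.sqrt(1.5 * ((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
                 / (l1**2 + l2**2 + l3**2))          # fractional anisotropy
    rd = (l2 + l3) / 2.0                             # radial diffusivity
    return fa, md, rd

# Hypothetical eigenvalues (units of 1e-3 mm^2/s) for a white-matter voxel
print(dti_metrics(np.array([1.7, 0.4, 0.3])))
```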

Keywords: tractography, diffusion weighted imaging, schizophrenia, evolutionary psychology

Procedia PDF Downloads 48
390 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method

Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay

Abstract:

This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments, prior to construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done solely post construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to test ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social, and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities, to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description, to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors’ internal status. The model has been applied in a simulation of hospital wards, and showed adaptability to a wide variety of situated behaviors and interactions.
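An event-based model of this kind can be pictured as a binding of several actors, a space, and an activity into one schedulable unit, rather than as isolated agents. The sketch below is a deliberately simplified illustration of that data structure; the class names and the stress parameter are invented for the example, not taken from the paper's model.

```python
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str
    role: str
    stress: float = 0.0          # example internal-status parameter

@dataclass
class Space:
    name: str
    occupants: list = field(default_factory=list)

@dataclass
class Event:
    """An event binds multiple actors, a space, and an activity,
    rather than simulating each agent in isolation."""
    activity: str
    actors: list
    space: Space

    def run(self):
        for a in self.actors:
            self.space.occupants.append(a.name)
        print(f"{self.activity} in {self.space.name}: {self.space.occupants}")

ward = Space("Ward A")
Event("morning round", [Actor("Dana", "nurse"), Actor("Lee", "physician")], ward).run()
```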

Keywords: agent based modeling, architectural design evaluation, event modeling, human behavior simulation, spatial cognition

Procedia PDF Downloads 262
389 Self-Supervised Attributed Graph Clustering with Dual Contrastive Loss Constraints

Authors: Lijuan Zhou, Mengqi Wu, Changyong Niu

Abstract:

Attributed graph clustering can utilize the graph topology and node attributes to uncover hidden community structures and patterns in complex networks, aiding in the understanding and analysis of complex systems. Utilizing contrastive learning for attributed graph clustering can effectively exploit meaningful implicit relationships within the data. However, existing attributed graph clustering methods based on contrastive learning suffer from the following drawbacks: 1) complex data augmentation increases computational cost, and inappropriate data augmentation may lead to semantic drift; 2) the selection of positive and negative samples neglects the intrinsic cluster structure learned from graph topology and node attributes. Therefore, this paper proposes a method called self-supervised Attributed Graph Clustering with Dual Contrastive Loss constraints (AGC-DCL). Firstly, Siamese Multilayer Perceptron (MLP) encoders are employed to generate two views separately, avoiding complex data augmentation. Secondly, a neighborhood contrastive loss is introduced to constrain node representations using the local topological structure, while attribute information is effectively embedded through attribute reconstruction. Additionally, a clustering-oriented contrastive loss is applied to fully utilize the clustering information in the global semantics for discriminative node representations, regarding the cluster centers from the two views as negative samples to fully leverage effective clustering information from different views. Comparative clustering results with existing attributed graph clustering algorithms on six datasets demonstrate the superiority of the proposed method.
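The clustering-oriented loss described above, with cluster centers from the two views acting as negatives, can be sketched in an InfoNCE-like form. The following is a simplified illustration of that idea, not the paper's exact loss; the temperature, shapes, and random inputs are placeholders.

```python
import torch
import torch.nn.functional as F

def cluster_contrastive_loss(z1, z2, centers1, centers2, tau=0.5):
    """Simplified sketch: pull the two views of each node together while
    pushing node embeddings away from the cluster centers of both views
    (treated as negatives), in the spirit of the clustering-oriented loss."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    centers = F.normalize(torch.cat([centers1, centers2]), dim=1)
    pos = torch.exp((z1 * z2).sum(dim=1) / tau)          # positive view pairs
    neg = torch.exp(z1 @ centers.t() / tau).sum(dim=1)   # cluster-center negatives
    return -torch.log(pos / (pos + neg)).mean()

z1, z2 = torch.randn(64, 32), torch.randn(64, 32)        # node embeddings, two views
c1, c2 = torch.randn(7, 32), torch.randn(7, 32)          # cluster centers, two views
print(cluster_contrastive_loss(z1, z2, c1, c2).item())
```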

Keywords: attributed graph clustering, contrastive learning, clustering-oriented, self-supervised learning

Procedia PDF Downloads 51
388 Author Profiling: Prediction of Learners’ Gender on a MOOC Platform Based on Learners’ Comments

Authors: Tahani Aljohani, Jialin Yu, Alexandra I. Cristea

Abstract:

The more an educational system knows about a learner, the more personalised interaction it can provide, which leads to better learning. However, asking a learner directly is potentially disruptive, and often ignored by learners. Especially in the booming realm of Massive Open Online Course (MOOC) platforms, only a very low percentage of users disclose demographic information about themselves. Thus, in this paper, we aim to predict learners' demographic characteristics by proposing an approach using linguistically motivated deep learning architectures for learner profiling, particularly targeting gender prediction on the FutureLearn MOOC platform. Additionally, we tackle the difficult problem of predicting the gender of learners based on their comments only, which are often available across MOOCs. The most common current approaches to text classification use the Long Short-Term Memory (LSTM) model, considering sentences as sequences. However, human language also has structure. In this research, rather than considering sentences as plain sequences, we hypothesise that higher semantic- and syntactic-level sentence processing based on linguistics will render a richer representation. We thus evaluate the traditional LSTM against other bleeding-edge models that take syntactic structure into account, such as the tree-structured LSTM, the Stack-augmented Parser-Interpreter Neural Network (SPINN), and the Structure-Aware Tag Augmented model (SATA). Additionally, we explore using different word-level encoding functions. We have implemented these methods on our MOOC dataset, on which they are most performant, compared with a public sentiment analysis dataset that is further used to cross-examine the models' results.
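As a reference point for the sequence-only side of this comparison, a plain LSTM comment classifier looks roughly as follows; the vocabulary size and dimensions are placeholders, not the paper's settings.

```python
import torch
import torch.nn as nn

class LSTMGenderClassifier(nn.Module):
    """Baseline sequence model of the kind compared in the paper:
    comment tokens -> embeddings -> LSTM -> binary gender prediction."""
    def __init__(self, vocab_size=20000, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 2)

    def forward(self, token_ids):
        x = self.emb(token_ids)
        _, (h, _) = self.lstm(x)       # final hidden state summarizes the comment
        return self.out(h[-1])

model = LSTMGenderClassifier()
logits = model(torch.randint(1, 20000, (4, 50)))  # batch of 4 comments, 50 tokens
print(logits.shape)  # torch.Size([4, 2])
```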

Keywords: deep learning, data mining, gender prediction, MOOCs

Procedia PDF Downloads 147
387 Problem Solving in Chilean Higher Education: Figurations Prior in Interpretations of Cartesian Graphs

Authors: Verónica Díaz

Abstract:

A Cartesian graph, as a mathematical object, becomes a tool for configuring change. It is best comprehended through everyday-life problem solving associated with its representation. Despite this, the current educational framework favors general graphs, without consideration of their argumentation. Students are required to find the mathematical function without associating it with the development of graphical language. This research describes the use students make of figurations made prior to Cartesian graphs with regard to an everyday-life problem related to a time and distance variation phenomenon. The theoretical framework describes the conditions of study of the function and its modeling. This is a qualitative, descriptive study involving six undergraduate case studies that were carried out during the first term of 2016 at the University of Los Lagos. The research problem concerned the graphic modeling of a real person's movement, and two levels of analysis were identified. The first level aims to identify local and global graph interpretations; the second level describes the degree of iconicity and referentiality of an image. According to the results, students initially drew no figures before the Cartesian graph, highlighting the need for students to represent the context and the movement that causes the change in the phenomenon. From this, they managed Cartesian graphs representing changes in position and therefore achieved an overall view of the graph. However, the local view only indicates specific events in the problem situation, using graphic and verbal expressions to represent movement. This view does not enable identifying what happens on the graph when the movement characteristics change, based on possible paths in the person's walking speed.

Keywords: cartesian graphs, higher education, movement modeling, problem solving

Procedia PDF Downloads 217
386 Household Knowledge, Attitude, and Determinants in Solid Waste Segregation: The Case of Sfax City

Authors: Leila Kharrat, Younes Boujelbene

Abstract:

In recent decades, solid waste management (SWM) has become a global concern because rapid population growth and the overexploitation of non-renewable resources have generated enormous amounts of waste, far exceeding carrying capacity; it also poses serious threats to the environment and health. However, it is difficult to combat the growing amount of solid waste without first assessing people's knowledge and attitudes. Therefore, this study was conducted to assess knowledge, attitudes, perceptions, and practices regarding the separation of solid waste in Sfax City. Nowadays, solid waste management is essential for sustainable development, hence the need for intensive research. Respondents from seven different districts in the city of Sfax were analyzed through a questionnaire survey of 342 households. This paper presents a qualitative exploratory study on the behavior of citizens in the field of waste separation. The objective is to know the antecedents of waste separation and the representations that individuals have about sorting waste in a specific territory that presents particular characteristics regarding waste management, namely Sfax City. Source separation is not widely practiced, and people usually sweep their premises, throwing waste components into the streets or neighboring plots. The results also indicate that participation in solid waste separation activities depends on the level of awareness of separation activities in the area, household income, and educational level. It is therefore argued that increasing the quality of municipal services is the best means of promoting positive attitudes toward solid waste separation activities. One effective strategy identified by households that can be initiated by policymakers to increase the rate of participation in separation activities, and eventually encourage participation in recycling activities, is to provide a financial incentive in all residential areas of Sfax City.

Keywords: solid waste management, waste separation, public policy, econometric modelling

Procedia PDF Downloads 234
385 Perceptual and Ultrasound Articulatory Training Effects on English L2 Vowels Production by Italian Learners

Authors: I. Sonia d’Apolito, Bianca Sisinni, Mirko Grimaldi, Barbara Gili Fivela

Abstract:

The American English contrast /ɑ-ʌ/ (cop-cup) is difficult for Italian learners to produce, since they realize L2-/ɑ-ʌ/ as L1-/ɔ-a/ respectively, due to differences in the phonetic-phonological systems and also in grapheme-to-phoneme conversion rules. In this paper, we try to answer the following research questions: Can a short training improve the production of English /ɑ-ʌ/ by Italian learners? Is perceptual training better than articulatory (ultrasound - US) training? Thus, we compare a perceptual training with a US articulatory one to observe: 1) the effects of short trainings on L2-/ɑ-ʌ/ productions; 2) whether the US articulatory training improves pronunciation more than the perceptual training. In this pilot study, 9 Salento-Italian monolingual adults participated: 3 subjects performed a 1-hour perceptual training (ES-P); 3 subjects performed a 1-hour US training (ES-US); and 3 control subjects did not receive any training (CS). Verbal instructions about the phonetic properties of L2-/ɑ-ʌ/ and L1-/ɔ-a/ and their differences (represented on the F1-F2 plane) were provided during both trainings. After these instructions, the ES-P group performed an identification training based on the High Variability Phonetic Training procedure, while the ES-US group performed the articulatory training by means of US videos of tongue gestures in L2-/ɑ-ʌ/ production and a dynamic view of their own tongue movements and position, using a probe under the chin. The acoustic data were analyzed and the first three formants were calculated. Independent t-tests were run to compare: 1) /ɑ/ and /ʌ/ in the pre- vs. post-test, respectively; 2) /ɑ/ and /ʌ/ in the pre- and post-test vs. L1-/ɔ/ and /a/, respectively. Results show that in the pre-test all speakers realize L2-/ɑ-ʌ/ as L1-/ɔ-a/ respectively. Contrary to the CS and ES-P groups, the ES-US group in the post-test differentiates the L2 vowels from those produced in the pre-test as well as from the L1 vowels, although only one ES-US subject produces both L2 vowels accurately. The articulatory training thus seems more effective than the perceptual one, since it favors production of vowels in the correct direction of the L2 vowels and away from the similar L1 vowels.
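The pre/post comparisons above boil down to independent t-tests on formant values. A minimal sketch with scipy follows, using invented F1 values rather than the study's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical F1 values (Hz) for L2-/ʌ/ in pre- vs. post-test (not the study's data)
f1_pre = np.array([700, 690, 710, 705, 695, 702])
f1_post = np.array([640, 655, 648, 660, 645, 650])

# Independent t-test, as run in the study for pre- vs. post-test comparisons
t, p = stats.ttest_ind(f1_pre, f1_post)
print(f"t = {t:.2f}, p = {p:.4f}")
```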

Keywords: L2 vowel production, perceptual training, articulatory training, ultrasound

Procedia PDF Downloads 255
384 Screening Diversity: Artificial Intelligence and Virtual Reality Strategies for Elevating Endangered African Languages in the Film and Television Industry

Authors: Samuel Ntsanwisi

Abstract:

This study investigates the transformative role of Artificial Intelligence (AI) and Virtual Reality (VR) in the preservation of endangered African languages. The study is contextualized within the film and television industry, highlighting disparities in screen representation for certain languages in South Africa and underscoring the need for increased visibility and preservation efforts. With globalization and cultural shifts posing significant threats to linguistic diversity, this research explores approaches to language preservation. By leveraging AI technologies, such as speech recognition, translation, and adaptive learning applications, and integrating VR for immersive and interactive experiences, the study aims to create a framework for teaching and passing on endangered African languages. Through digital documentation, interactive language learning applications, storytelling, and community engagement, the research demonstrates how these technologies can empower communities to revitalize their linguistic heritage. This study employs a dual-method approach, combining a rigorous literature review to analyse existing research on the convergence of AI, VR, and language preservation with primary data collection through interviews and surveys with ten filmmakers. The literature review establishes a solid foundation for understanding the current landscape, while the interviews with filmmakers provide crucial real-world insights, enriching the study's depth. This balanced methodology ensures a comprehensive exploration of the intersection of AI, VR, and language preservation, offering both theoretical insights and practical perspectives from industry professionals.

Keywords: language preservation, endangered languages, artificial intelligence, virtual reality, interactive learning

Procedia PDF Downloads 59
383 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa

Authors: Samy A. Khalil, U. Ali Rahoma

Abstract:

For measurements of solar radiation, satellite data have been routinely utilized to estimate solar energy. However, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observation data from a variety of sources, including ground stations, satellites, ships, and aircraft. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the reanalysis dataset (ERA-5) for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa over the ten-year period from 2011 to 2020. To investigate seasonal changes in dataset performance, a seasonal statistical analysis was conducted, which showed considerable differences in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset: the data's monthly mean values indicate better performance, but data accuracy is degraded. Solar resource assessment and power estimation are discussed using the ERA-5 solar radiation data. Over the study period, the average values of the mean bias error (MBE), root mean square error (RMSE), and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, and the correlation coefficient (R²) varies from 0.93 to 0.99. The objective of this research is to provide a reliable representation of the world's solar radiation to aid the use of solar energy in all sectors.
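The four validation statistics quoted above have standard definitions. A minimal sketch of computing them against ground measurements follows; the daily GSR values are invented placeholders, not the study's data.

```python
import numpy as np

def validation_stats(observed, estimated):
    """MBE, RMSE, MAE, and R^2 for validating reanalysis radiation
    against ground measurements."""
    err = estimated - observed
    mbe = err.mean()
    rmse = np.sqrt((err**2).mean())
    mae = np.abs(err).mean()
    ss_res = (err**2).sum()
    ss_tot = ((observed - observed.mean())**2).sum()
    r2 = 1 - ss_res / ss_tot
    return mbe, rmse, mae, r2

obs = np.array([5.1, 6.3, 7.0, 6.5, 5.8])   # hypothetical daily GSR, kWh/m^2
est = np.array([5.3, 6.1, 7.2, 6.4, 6.0])   # ERA-5 estimates for the same days
print(validation_stats(obs, est))
```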

Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa

Procedia PDF Downloads 96
382 Investigating Students' Understanding about Mathematical Concept through Concept Map

Authors: Rizky Oktaviana

Abstract:

The main purpose of studying lies in improving students' understanding. Teachers usually use written tests to measure students' understanding of learning material, especially mathematical learning material. This common method has a weakness: for mathematics content, written tests only show the procedural steps used to solve mathematical problems. Teachers are therefore unable to see whether students actually understand the mathematical concepts and the relations between concepts. One of the best tools for observing students' understanding of mathematical concepts is the concept map. The goal of this research is to describe junior high school students' understanding of mathematical concepts through concept maps, based on differences in mathematical ability. There were three steps in this research. The first step was choosing the research subjects by giving a mathematical ability test to students; the subjects were three students with different mathematical abilities: high, intermediate, and low. The second step was giving concept mapping training to the chosen subjects. The last step was giving the subjects a concept mapping task about functions. Nodes, which are the representations of concepts of function, were provided in the concept mapping task, and the subjects had to use the nodes in their concept mapping. Based on the data analysis, the results of this research show that the subject with high mathematical ability has formal understanding, since that subject could see the connections between concepts of function and arrange the concepts into a concept map with a valid hierarchy. The subject with intermediate mathematical ability has relational understanding, because that subject could arrange all the given concepts and gave appropriate labels between concepts, though the connections were not yet represented specifically. The subject with low mathematical ability has poor understanding of functions, as can be seen from a concept map that used only a few of the given concepts, because the subject could not see the connections between concepts. All subjects have instrumental understanding of the relations between the linear function concept, the quadratic function concept, and domain, codomain, and range.

Keywords: concept map, concept mapping, mathematical concepts, understanding

Procedia PDF Downloads 270
381 An Integrated Label Propagation Network for Structural Condition Assessment

Authors: Qingsong Xiong, Cheng Yuan, Qingzhao Kong, Haibei Xiong

Abstract:

Deep-learning-driven approaches based on vibration responses have attracted growing attention in rapid structural condition assessment, while obtaining sufficient measured training data with corresponding labels is comparatively costly and even inaccessible in practical engineering. This study proposes an integrated label propagation network for structural condition assessment, which is able to diffuse labels from continuously generated measurements of the intact structure to damage scenarios with missing labels. The integrated network embeds damage-sensitive feature extraction by a deep autoencoder and pseudo-label propagation by optimized fuzzy clustering; its architecture and mechanism are elaborated. With a sophisticated network design and specified strategies for improving performance, the present network extends the strengths of self-supervised representation learning, unsupervised fuzzy clustering, and supervised classification algorithms into an integration aimed at assessing damage conditions. Both numerical simulations and full-scale laboratory shaking table tests of a two-story building structure were conducted to validate its capability of detecting post-earthquake damage. The identification accuracy of the present network was 0.95 in the numerical validations and 0.86 on average in the laboratory case studies. It should be noted that the whole training procedure of all models involved in the network strictly does not rely on any labeled data of damage scenarios, but only on several samples of the intact structure, which indicates a significant advantage in model adaptability and feasible applicability in practice.
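The autoencoder-plus-clustering pipeline can be caricatured in a few lines. The sketch below uses an MLP trained to reconstruct its input as a stand-in autoencoder and k-means as a stand-in for the optimized fuzzy clustering, so it illustrates only the flow of pseudo-labeling, not the paper's actual models; all data are random placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.cluster import KMeans

# Hypothetical vibration features: rows are response samples
X = np.random.rand(200, 64)

# Stand-in autoencoder: an MLP trained to reconstruct its input; the hidden
# layer plays the role of the damage-sensitive feature extractor.
ae = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500).fit(X, X)
Z = np.maximum(X @ ae.coefs_[0] + ae.intercepts_[0], 0)  # ReLU encodings

# Stand-in for optimized fuzzy clustering: cluster the encodings and treat
# the cluster assignments as propagated pseudo-labels for condition states.
pseudo = KMeans(n_clusters=3, n_init=10).fit_predict(Z)
print(np.bincount(pseudo))  # samples per pseudo-labeled condition
```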

Keywords: autoencoder, condition assessment, fuzzy clustering, label propagation

Procedia PDF Downloads 93
380 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged due to its close representation of real-world data and its volume. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing node classification using those embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
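The PyG end of that pipeline, reduced to its simplest form, is ordinary GCN node classification. The sketch below runs on a toy six-node stand-in graph with invented features and condition labels, not the Synthea/TigerGraph data.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy stand-in for the EHR graph: 6 patient nodes, a few edges, 8 features each
edge_index = torch.tensor([[0, 1, 2, 3, 4, 5], [1, 0, 3, 2, 5, 4]])
data = Data(x=torch.randn(6, 8), edge_index=edge_index,
            y=torch.tensor([0, 0, 1, 1, 2, 2]))  # hypothetical condition labels

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 16)
        self.conv2 = GCNConv(16, 3)   # 3 hypothetical condition classes

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(100):
    opt.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    opt.step()
print(model(data).argmax(dim=1))  # predicted condition per node
```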

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 85
379 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the key components for various kinds of domain applications of video data systems. Recently, it has attracted extensive interest from practitioners and academics in different areas. While video event detection has been the subject of broad study efforts recently, considerably fewer existing methods have considered multi-modal data and efficiency-related issues. During soccer matches, various doubtful situations arise that cannot be easily judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations due to errors or the high velocity of the events. Bayesian networks give a structure for dealing with this uncertainty using a basic graphical structure together with probability calculus. We propose an efficient structure for analysis and summarization of soccer videos utilizing object-based features. The proposed work utilizes the t-cherry junction tree, a very recent advancement in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model. There are various advantages to this approach: firstly, the t-cherry junction tree gives the best approximation among the class of junction trees; secondly, the construction of a t-cherry junction tree can be largely parallelized; and lastly, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and strength of the proposed work, shown over a comprehensive data set comprising several soccer videos captured at different places.

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 323
378 Effect of Naphtha in Addition to a Cycle Steam Stimulation Process Reducing the Heavy Oil Viscosity Using a Two-Level Factorial Design

Authors: Nora A. Guerrero, Adan Leon, María I. Sandoval, Romel Perez, Samuel Munoz

Abstract:

The addition of solvents to cyclic steam stimulation is a technique that has shown an impact on the improved recovery of heavy oils. With this technique, it is possible to reduce the steam/oil ratio in the last stages of the process, at which time this ratio increases significantly. The mobility of the improved crude oil increases due to structural changes of its components, which are in turn reflected in decreases in density and viscosity. In the present work, the effects of the variables temperature, time, and weight percentage of naphtha were evaluated using a 2³ factorial design of experiments. From the results of the analysis of variance (ANOVA) and the Pareto diagram, it was possible to identify the effect on viscosity reduction. The experimental representation of the crude-steam-naphtha interaction was carried out in a batch reactor on a Colombian heavy oil of 12.8° API and 3500 cP. The conditions of temperature, reaction time, and percentage of naphtha were 270-300 °C, 48-66 hours, and 3-9% by weight, respectively. The results showed a decrease in density, with values in the range of 0.9542 to 0.9414 g/cm³, while the viscosity decrease was on the order of 55 to 70%. On the other hand, simulated distillation results, according to ASTM D7169, revealed significant conversions of the 315 °C+ fraction. From nuclear magnetic resonance (NMR), infrared (FTIR), and ultraviolet-visible (UV-VIS) spectroscopy, it was determined that the increase in the yield of light fractions in the improved crude is due to the breakdown of alkyl chains. The methodology for cyclic steam injection with naphtha, together with the laboratory-scale characterization, can be considered a practical tool in improved recovery processes.
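A 2³ factorial design enumerates all eight low/high combinations of the three factors, and main and interaction effects fall out of simple contrasts. A minimal sketch follows; the eight responses are invented placeholders, whereas in the study they would be the measured viscosity reductions.

```python
import itertools
import numpy as np

# 2^3 full factorial: temperature, time, naphtha wt% at low/high levels
levels = {"T": (270, 300), "t": (48, 66), "naphtha": (3, 9)}
runs = list(itertools.product([-1, 1], repeat=3))   # coded design matrix, 8 runs

# Hypothetical viscosity-reduction responses (%) for the 8 runs
y = np.array([55, 58, 60, 63, 59, 64, 66, 70], dtype=float)

X = np.array(runs, dtype=float)
for name, col in zip(levels, X.T):
    # main effect = mean response at high level minus mean at low level
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")

# an interaction effect, e.g. T x naphtha, uses the product of coded columns
inter = X[:, 0] * X[:, 2]
print(f"T x naphtha interaction: {y[inter == 1].mean() - y[inter == -1].mean():+.2f}")
```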

Keywords: viscosity reduction, cyclic steam stimulation, factorial design, naphtha

Procedia PDF Downloads 173
377 Planning Urban Sprawl in Mining Areas in Africa: How to Ensure Coherent Development

Authors: Pascal Rey, Anaïs Weber

Abstract:

Many mining projects have been developed in Africa over the last decades. Due to the economic opportunities they offer, these projects result in a massive and rapid influx of migrants to the surrounding area. In areas where central government representation is low and the local administration lacks financial resources, urban development is often anarchic, beyond all public control. It leads to socio-spatial segregation, insecurity, and a rising risk of social conflicts. Aware that their economic development is closely correlated with the local situation, mining companies are getting more and more involved in regional planning, setting up tools and strategic directions documents. One of the commonly used tools in this regard is the "Influx Management Plan". It consists in looking at the region's absorption capacities in order to ensure its coherent development, and in developing several urban centers rather than one macrocephalic city. It includes many other measures, such as urban governance support, skills transfer, creation of strategic guidelines, financial support (local taxes, mining taxes, development funds, etc.), and local development projects. Through various examples of mining projects in Guinea, a country that is host to many large mining projects, we will look at the implications of regional and urban planning in which mining companies, as well as public authorities, are key players. While their investment capacity offers advantages and accelerates development, their actions raise questions about the unilaterality of interests and local governance. By interfering in public affairs, are mining companies not increasing the risk of central and local government shirking their responsibilities in terms of regional development, or even calling their legitimacy into question? Is such public-private collaboration really sustainable for the region as a whole and for all stakeholders?

Keywords: Africa, guinea, mine, urban planning

Procedia PDF Downloads 496
376 An Exploratory Sequential Design: A Mixed Methods Model for the Statistics Learning Assessment with a Bayesian Network Representation

Authors: Zhidong Zhang

Abstract:

This study established a mixed methods model for assessing statistics learning with Bayesian network models. There are three variants of exploratory sequential designs; the one used here links three steps: qualitative data collection and analysis; development of a quantitative measure, instrument, or intervention; and quantitative data collection and analysis. The study used the scoring model of analysis of variance (ANOVA) as the content domain. The research examines students' learning in both semantic and performance aspects at a fine-grained level. The ANOVA score model, y = α + βx₁ + γx₂ + ε, served as the cognitive task used to collect data during the student learning process. When the learning processes were decomposed into multiple steps in both semantic and performance aspects, a hierarchical Bayesian network was established. This is a theory-driven process: the hierarchical structure was obtained from qualitative cognitive analysis. The data from students' learning of the ANOVA score model were used as evidence for the hierarchical Bayesian network model through the evidential variables. Finally, the assessment results of students' learning of the ANOVA score model were reported. Briefly, this was a mixed methods research design applied to statistics learning assessment. Mixed methods designs expand the possibilities for researchers to establish advanced quantitative models initially grounded in a theory-driven qualitative mode.
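A hierarchical Bayesian network of the kind described, with observable task performances as evidential variables under a latent proficiency, can be prototyped in a few lines. The sketch below uses the pgmpy library as one possible tool; the structure, node names, and all probabilities are invented for illustration and are not taken from the study.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical two-level hierarchy: an overall ANOVA-model proficiency drives
# a semantic skill and a performance skill, each observed via a task score.
model = BayesianNetwork([("Proficiency", "Semantic"), ("Proficiency", "Performance")])
model.add_cpds(
    TabularCPD("Proficiency", 2, [[0.6], [0.4]]),
    TabularCPD("Semantic", 2, [[0.9, 0.3], [0.1, 0.7]],
               evidence=["Proficiency"], evidence_card=[2]),
    TabularCPD("Performance", 2, [[0.8, 0.2], [0.2, 0.8]],
               evidence=["Proficiency"], evidence_card=[2]))

# Update the belief about proficiency after observing a correct performance step
posterior = VariableElimination(model).query(["Proficiency"], evidence={"Performance": 1})
print(posterior)
```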

Keywords: exploratory sequential design, ANOVA score model, Bayesian network model, mixed methods research design, cognitive analysis

Procedia PDF Downloads 177
375 Classification on Statistical Distributions of a Complex N-Body System

Authors: David C. Ni

Abstract:

Contemporary models for N-body systems are based on temporal, two-body, and mass-point representations of Newtonian mechanics. Other mainstream models include 2D and 3D Ising models based on local neighborhoods of lattice structures. In quantum mechanics, theories of collective modes exist for superconductivity and for long-range quantum entanglement. However, these models are still mainly for specific phenomena with a set of designated parameters. We are therefore motivated to develop a new construction directly from complex-variable N-body systems based on the extended Blaschke functions (EBF), which represent a non-temporal and nonlinear extension of the Lorentz transformation on the complex plane – the normalized momentum space. A point on the complex plane represents a normalized state of particle momenta observed from a reference frame in the theory of special relativity. There are only two key parameters for modelling: normalized momentum and nonlinearity. An algorithm similar to the Jenkins-Traub method is adopted for solving the EBF iteratively. Through iteration, the solution sets take the form σ + i[-t, t], where σ and t are real numbers, and [-t, t] shows various distributions, such as 1-peak, 2-peak, and 3-peak distributions, some of which are analogous to the canonical distributions. The results of the numerical analysis demonstrate continuum-to-discreteness transitions, evolutional invariance of distributions, phase transitions with conjugate symmetry, etc., which manifest the construction as a potential candidate for the unification of statistics. We hereby classify the observed distributions on the finite convergent domains. Continuous and discrete distributions both exist and are predictable for given partitions in different regions of the parameter pair. We further compare these distributions with canonical distributions and address the impacts on existing applications.

Keywords: blaschke, lorentz transformation, complex variables, continuous, discrete, canonical, classification

Procedia PDF Downloads 309
374 Comparison of E-learning and Face-to-Face Learning Models Through the Early Design Stage in Architectural Design Education

Authors: Gülay Dalgıç, Gildis Tachir

Abstract:

Architectural design studios are the ambience in which architectural design is realized as a palpable product in architectural education. In the design studios, the information that the architect candidate will use in the design process, the methods of approaching the design problem, the solution proposals, etc., are set up together with the studio coordinators. The architectural design process, on the other hand, is complex and uncertain. Candidate architects work in a process that starts with abstract and ill-defined problems. This process starts with the generation of alternative solutions with the help of representation tools, continues with the selection of the appropriate/satisfactory solution from these alternatives, and then ends with the creation of an acceptable design/result product. In the studio ambience, many designs and thought relationships are evaluated, and the most important step is the early design phase. In the early design phase, the first steps of converting the information are taken, and the converted information is used in the constitution of the first design decisions. This phase, which positively affects the progress of the design process and the constitution of the final product, is more complex and fuzzy than the other phases of the design process. In this context, the aim of the study is to investigate the effects of the face-to-face learning model and the e-learning model on the early design phase. In the study, the early design phase was defined through literature research. The data for the defined early design phase criteria were obtained with feedback graphics created for architect candidates who performed e-learning in the first year of architectural education and continued their education with the face-to-face learning model. The findings were analyzed with a common graphics program. It is thought that this research will contribute to the establishment of a contemporary architectural design education model by reflecting the evaluation of the data and results on architectural education.

Keywords: education modeling, architecture education, design education, design process

Procedia PDF Downloads 135
373 Alignment and Antagonism in Flux: A Diachronic Sentiment Analysis of Attitudes towards the Chinese Mainland in the Hong Kong Press

Authors: William Feng, Qingyu Gao

Abstract:

Despite extensive discussion of Hong Kong's sentiments towards the Chinese Mainland since the sovereignty transfer in 1997, there has been no large-scale empirical analysis of the changing attitudes in the mainstream media, which both reflect and shape sentiments in society. To address this gap, the present study uses an optimised semantic-based automatic sentiment analysis method to examine a corpus of news about China from 1997 to 2020 in three main Chinese-language newspapers in Hong Kong, namely Apple Daily, Ming Pao, and Oriental Daily News. The analysis shows that although the Hong Kong press had a positive emotional tone toward China in general, the overall trend of sentiment became increasingly negative. Meanwhile, alignment and antagonism toward China have both increased, providing empirical evidence of attitudinal polarisation in Hong Kong society. Specifically, Apple Daily's depictions of China became increasingly negative, though with some positive turns before 2008, whilst Oriental Daily News has consistently expressed more favourable sentiments. Ming Pao maintained an impartial stance toward China through an increased but balanced representation of positive and negative sentiments, with its subjectivity and sentiment intensity growing to an industry-standard level. The results provide new insights into the complexity of sentiments towards China in the Hong Kong press, and into media attitudes in general in terms of "us" and "them" positioning, by explicating the cross-newspaper and cross-period variations using an enhanced sentiment analysis method that incorporates sentiment-oriented and semantic role analysis techniques.

Keywords: media attitude, sentiment analysis, Hong Kong press, one country two systems

Procedia PDF Downloads 115
372 Khilafat from Khilafat-e-Rashida: The Rightly Guided the Only Form of Governance to Unite Muslim Countries

Authors: Zoaib Mirza

Abstract:

Half of the Muslim countries in the world have declared Islam the state religion in their constitutions. Yet none of these countries have implemented authentic Islamic laws in line with the Quran (Holy Book), the practices of Prophet Mohammad (P.B.U.H), called the Sunnah, and his four successors, known as the Rightly Guided Khalifas. Since their independence, these countries have adopted different systems of government, such as democracy, dictatorship, republic, communism, and monarchy. Instead of benefiting the people, these government systems have put these countries into political, social, and economic crises. These Islamic countries do not have equal representation and membership in worldwide political forums, which are led by Western countries. Therefore, it is now imperative for the Muslim leaders of all these countries to collaborate, reset, and implement the original Islamic form of government, which led to the prosperity and success of people, including non-Muslims, 1400 years ago. They should unite as one nation under the Khilafat, which means establishing the authority of Allah (SWT) and following the divine commandments related to the social, political, and economic systems. As they have declared Islam in their constitutions, they should work together to apply the divine framework of governance revealed by Allah (SWT) and implemented by Prophet Mohammad (P.B.U.H) and his four successors, called Khalifas. This paper provides an overview of the downfall and end of the Khilafat system by 1924, of the ways in which the West caused political, social, and economic crises in the Muslim countries, and finally a summary of the social, political, and economic systems implemented by Prophet Mohammad (P.B.U.H) and his successors, the Khalifas called the Rightly Guided: Hazrat Abu Bakr (RA), Hazrat Omar (RA), Hazrat Usman (RA), and Hazrat Ali (RA).

Keywords: khalifat, khilafat-e-Rashida, the rightly guided, colonization, capitalism, neocolonization, government systems

Procedia PDF Downloads 118
371 Geospatial Curve Fitting Methods for Disease Mapping of Tuberculosis in Eastern Cape Province, South Africa

Authors: Davies Obaromi, Qin Yongsong, James Ndege

Abstract:

To interpolate scattered or regularly distributed data, there are imprecise and exact methods. Some of these methods can be used for interpolating data on a regular grid, and others on an irregular grid. In spatial epidemiology, it is important to examine how disease prevalence rates are distributed in space and how they relate to each other within a defined distance and direction. In this study, for the geographic and graphic representation of disease prevalence, linear and biharmonic spline methods were implemented in MATLAB and used to identify, localize, and compare smoothing in the distribution patterns of tuberculosis (TB) in the Eastern Cape Province. The aim of this study is to produce a smoother graphical disease map of TB prevalence patterns by 3-D curve-fitting techniques, especially the biharmonic splines, which can suppress noise easily by seeking a least-squares fit rather than exact interpolation. The datasets are represented generally as 3D or XYZ triplets, where X and Y are the spatial coordinates and Z is the variable of interest, in this case TB counts in the province. The smoothing spline is a method of fitting a smooth curve to a set of noisy observations using a spline function, and it has become a conventional method owing to its high precision, simplicity, and flexibility. Surface and contour plots were produced for TB prevalence at the provincial level for 2012-2015. From the results, the general outlook of all the fittings showed a systematic pattern in the distribution of TB cases in the province, which is consistent with spatial statistical analyses carried out in the province. This method is rarely used in disease mapping applications, but it has the advantage that it can be assessed at chosen locations rather than only on a rectangular grid, as in most traditional GIS methods of geospatial analysis.
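In Python, a comparable surface can be fitted with a radial basis function; a thin-plate RBF behaves much like MATLAB's biharmonic ('v4') interpolation, and a nonzero smoothing term relaxes it toward a least-squares fit. The coordinates and TB counts below are invented placeholders, not the provincial data.

```python
import numpy as np
from scipy.interpolate import Rbf

# Hypothetical XYZ triplets: (x, y) site coordinates and z = TB counts
x = np.array([0.0, 1.0, 2.0, 0.5, 1.5, 2.5])
y = np.array([0.0, 0.5, 1.0, 2.0, 1.5, 0.2])
z = np.array([120., 340., 210., 95., 400., 180.])

# Thin-plate RBF as a stand-in for the biharmonic spline; smooth=0 would
# interpolate exactly, smooth>0 relaxes toward a least-squares fit.
spline = Rbf(x, y, z, function="thin_plate", smooth=0.1)

# Evaluate on a grid for surface/contour plotting, or at arbitrary points
xi, yi = np.meshgrid(np.linspace(0, 2.5, 50), np.linspace(0, 2, 50))
zi = spline(xi, yi)
print(zi.shape, spline(1.2, 0.8))
```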

Keywords: linear, biharmonic splines, tuberculosis, South Africa

Procedia PDF Downloads 238
370 Fluid Structure Interaction Study between Ahead and Angled Impact of AGM 88 Missile Entering Relatively High Viscous Fluid for K-Omega Turbulence Model

Authors: Abu Afree Andalib, Rafiur Rahman, Md Mezbah Uddin

Abstract:

The main objective of this work is to analyze the various parameters of the AGM-88 missile using the FSI module in Ansys. Computational fluid dynamics is used for the study of the fluid flow pattern and fluidic phenomena such as drag, pressure force, energy dissipation, and shockwave distribution in water. Using the finite element analysis module of Ansys, structural parameters such as stress and stress density, localization point, deflection, and force propagation are determined. A separate analysis of the structural parameters is done in Abaqus. A state-of-the-art coupling module is used for the FSI analysis. A fine mesh is considered in every case for better results during simulation, within the limits of the available computational power. The results for the above-mentioned parameters are analyzed and compared for the two phases using graphical representations, and the results from Ansys and Abaqus are also shown. Computational fluid dynamics and finite element analyses, and subsequently the fluid-structure interaction (FSI) technique, are considered, with the finite volume method used for modelling the fluid flow and the finite element method for the structural analysis. Feasible boundary conditions are also utilized in the research. Significant changes in the interaction and interference patterns during the impact were found. Both theoretically and according to the simulation, the angled condition was found to produce the higher impact.

Keywords: FSI (fluid-structure interaction), impact, missile, high viscous fluid, CFD (computational fluid dynamics), FEM (finite element method), FVM (finite volume method), fluid flow, fluid pattern, structural analysis, AGM-88, Ansys, Abaqus, meshing, k-omega, turbulence model

Procedia PDF Downloads 465
369 Automatic Near-Infrared Image Colorization Using Synthetic Images

Authors: Yoganathan Karthik, Guhanathan Poravi

Abstract:

Colorizing near-infrared (NIR) images poses unique challenges due to the absence of color information and the nuances in light absorption. In this paper, we present an approach to NIR image colorization utilizing a synthetic dataset generated from visible light images. Our method addresses two major challenges encountered in NIR image colorization: accurately colorizing objects with color variations and avoiding over/under saturation in dimly lit scenes. To tackle these challenges, we propose a Generative Adversarial Network (GAN)-based framework that learns to map NIR images to their corresponding colorized versions. The synthetic dataset ensures diverse color representations, enabling the model to effectively handle objects with varying hues and shades. Furthermore, the GAN architecture facilitates the generation of realistic colorizations while preserving the integrity of dimly lit scenes, thus mitigating issues related to over/under saturation. Experimental results on benchmark NIR image datasets demonstrate the efficacy of our approach in producing high-quality colorizations with improved color accuracy and naturalness. Quantitative evaluations and comparative studies validate the superiority of our method over existing techniques, showcasing its robustness and generalization capability across diverse NIR image scenarios. Our research not only contributes to advancing NIR image colorization but also underscores the importance of synthetic datasets and GANs in addressing domain-specific challenges in image processing tasks. The proposed framework holds promise for various applications in remote sensing, medical imaging, and surveillance where accurate color representation of NIR imagery is crucial for analysis and interpretation.
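A minimal PyTorch rendering of the adversarial objective described above is sketched below: a toy generator maps a 1-channel NIR image to 3-channel color, and the generator loss combines an adversarial term with an L1 term against the paired synthetic target, in the style of pix2pix. The architectures and the weight of 100 on the L1 term are illustrative placeholders, not the paper's configuration.

```python
import torch
import torch.nn as nn

# Minimal generator/discriminator pair for NIR (1 channel) -> RGB (3 channels);
# a real system would use deeper U-Net / PatchGAN architectures.
G = nn.Sequential(nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(64, 3, 3, padding=1), nn.Tanh())
D = nn.Sequential(nn.Conv2d(4, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
                  nn.Conv2d(64, 1, 3, padding=1))  # conditioned on the NIR input

bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()

nir = torch.randn(2, 1, 64, 64)   # NIR input batch
rgb = torch.randn(2, 3, 64, 64)   # paired color target from the synthetic set

fake = G(nir)
pred = D(torch.cat([nir, fake], dim=1))
# pix2pix-style objective: fool the discriminator + stay close to the target
g_loss = bce(pred, torch.ones_like(pred)) + 100.0 * l1(fake, rgb)
print(g_loss.item())
```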

Keywords: computer vision, near-infrared images, automatic image colorization, generative adversarial networks, synthetic data

Procedia PDF Downloads 42
368 Comparison Approach for Wind Resource Assessment to Determine Most Precise Approach

Authors: Tasir Khan, Ishfaq Ahmad, Yejuan Wang, Muhammad Salam

Abstract:

Distribution models of wind speed data are essential for assessing potential wind energy because they decrease the uncertainty in estimating wind energy output. Therefore, before performing a detailed potential energy analysis, the precise distribution model for the wind speed data must be found. In this research, numerous goodness-of-fit criteria, such as the Kolmogorov-Smirnov and Anderson-Darling statistics, Chi-Square, root mean square error (RMSE), AIC, and BIC, were finally combined to determine the best-fitted distribution of the wind speed; the suggested method takes each criterion into account collectively. This method was used to statistically fit 14 distribution models to wind speed data at four sites in Pakistan. The results show that this method provides the best basis for selecting the most suitable statistical distribution of wind speed, and the graphical representation is consistent with the analytical results. This research presents three estimation methods that can be used to fit the different distributions used to estimate the wind. In the suggested MLM, MOM, and MLE, the third-order moment used in the wind energy formula is a key quantity because it makes an important contribution to the precise estimate of wind energy. In order to demonstrate the performance of the suggested MOM, it was compared with well-known estimation methods, such as the method of linear moments and maximum likelihood estimation. In the comparative analysis, based on several goodness-of-fit criteria, the performance of the considered techniques is estimated on actual wind speeds evaluated in different time periods. The results obtained show that MOM certainly provides a more precise estimation than other familiar approaches in terms of estimating wind energy based on the fourteen distributions. Therefore, MOM can be used as a better technique for assessing wind energy.
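As a concrete illustration of the fit-then-test workflow, the sketch below fits a Weibull distribution by maximum likelihood, scores it with the Kolmogorov-Smirnov statistic, and estimates wind power density from the third moment of speed. The data are synthetic, and Weibull is just one of the fourteen candidate distributions such a study would compare.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
wind = rng.weibull(2.0, 1000) * 7.0      # hypothetical wind speeds (m/s)

# Fit a candidate distribution by maximum likelihood (location fixed at 0)
shape, loc, scale = stats.weibull_min.fit(wind, floc=0)

# Goodness of fit: Kolmogorov-Smirnov statistic for the fitted distribution
ks_stat, p_value = stats.kstest(wind, "weibull_min", args=(shape, loc, scale))
print(f"k={shape:.2f}, c={scale:.2f}, KS={ks_stat:.3f}, p={p_value:.3f}")

# Wind power density uses the third moment of speed: P = 0.5 * rho * E[v^3]
rho = 1.225                               # air density, kg/m^3
print(f"power density ~ {0.5 * rho * np.mean(wind**3):.1f} W/m^2")
```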

Keywords: wind-speed modeling, goodness of fit, maximum likelihood method, linear moment

Procedia PDF Downloads 83
367 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) calculation of the "weights" of the elements of the system. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems, research, and the design of systems with optimal structure are carried out.
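The quantity being computed is the probability that at least one shortest path (minimal path set) of working elements exists. The orthogonalized DNF evaluates this efficiently; the sketch below obtains the same number by brute-force inclusion-exclusion on a small hypothetical system, which is adequate at toy scale.

```python
from itertools import combinations
from functools import reduce

def system_reliability(paths, p):
    """Probability that at least one minimal path works, by inclusion-exclusion
    (equivalent to what orthogonalization of the DNF achieves more efficiently).
    Assumes independent elements with reliabilities given in p."""
    total = 0.0
    for k in range(1, len(paths) + 1):
        for subset in combinations(paths, k):
            union = set().union(*subset)            # elements in this path union
            prob = reduce(lambda a, e: a * p[e], union, 1.0)
            total += (-1) ** (k + 1) * prob
    return total

# Hypothetical bridge-like system: minimal paths over elements 1..5
paths = [{1, 2}, {3, 4}, {1, 5, 4}, {3, 5, 2}]
p = {e: 0.9 for e in range(1, 6)}                   # element reliabilities
print(f"system reliability = {system_reliability(paths, p):.4f}")
```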

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements

Procedia PDF Downloads 65
366 Document-level Sentiment Analysis: An Exploratory Case Study of Low-resource Language Urdu

Authors: Ammarah Irum, Muhammad Ali Tahir

Abstract:

Document-level sentiment analysis in Urdu is a challenging Natural Language Processing (NLP) task due to the difficulty of working with lengthy texts in a language with constrained resources. Deep learning models, which are complex neural network architectures, are well-suited to text-based applications, in addition to data formats like audio, image, and video. To investigate the potential of deep learning for Urdu sentiment analysis, we implemented five different deep learning models, including Bidirectional Long Short-Term Memory (BiLSTM), Convolutional Neural Network (CNN), Convolutional Neural Network with Bidirectional Long Short-Term Memory (CNN-BiLSTM), and Bidirectional Encoder Representations from Transformers (BERT). In this study, we developed a hybrid deep learning model called BiLSTM-Single Layer Multi Filter Convolutional Neural Network (BiLSTM-SLMFCNN) by fusing the BiLSTM and CNN architectures. The proposed and baseline techniques were applied to the Urdu Customer Support data set and the IMDB Urdu movie review data set, using pre-trained Urdu word embeddings suitable for document-level sentiment analysis. The results of these techniques were evaluated, and our proposed model outperforms all other deep learning techniques for Urdu sentiment analysis. BiLSTM-SLMFCNN outperformed the baseline deep learning models and achieved 83%, 79%, 83%, and 94% accuracy on the small, medium, and large sized IMDB Urdu movie review data sets and the Urdu Customer Support data set, respectively.
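Structurally, the proposed fusion can be read as a BiLSTM whose outputs feed one convolutional layer with several filter widths before pooling and classification. The sketch below is an interpretation along those lines; the vocabulary size, dimensions, filter widths, and filter counts are placeholders rather than the paper's hyperparameters.

```python
import torch
import torch.nn as nn

class BiLSTM_SLMFCNN(nn.Module):
    """Sketch of the proposed fusion: BiLSTM outputs are passed through a
    single convolutional layer with multiple filter widths, then max-pooled."""
    def __init__(self, vocab=30000, emb=300, hidden=128, widths=(2, 3, 4), nf=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb, padding_idx=0)
        self.bilstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.convs = nn.ModuleList(
            [nn.Conv1d(2 * hidden, nf, w) for w in widths])  # multi-filter, one layer
        self.fc = nn.Linear(nf * len(widths), 2)             # positive / negative

    def forward(self, ids):
        h, _ = self.bilstm(self.emb(ids))          # (B, T, 2*hidden)
        h = h.transpose(1, 2)                      # (B, 2*hidden, T) for Conv1d
        feats = [c(h).relu().max(dim=2).values for c in self.convs]
        return self.fc(torch.cat(feats, dim=1))

model = BiLSTM_SLMFCNN()
print(model(torch.randint(1, 30000, (4, 80))).shape)  # torch.Size([4, 2])
```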

Keywords: urdu sentiment analysis, deep learning, natural language processing, opinion mining, low-resource language

Procedia PDF Downloads 70
365 Analysing the Interactive Effects of Factors Influencing Sand Production on Drawdown Time in High Viscosity Reservoirs

Authors: Gerald Gwamba, Bo Zhou, Yajun Song, Dong Changyin

Abstract:

The challenges that sand production presents to the oil and gas industry, particularly when working in poorly consolidated reservoirs, cannot be overstated. From restricting production to blocking production tubing, sand production increases the costs associated with production, as it elevates the cost of servicing production equipment over time. Production in reservoirs that present high viscosities, flow rates, cementation, clay content, and fine sand content is even more complex and challenging. As opposed to one-factor-at-a-time testing, investigating the interactive effects arising from a combination of several factors offers increased reliability of results as well as better representation of actual field conditions. It is thus paramount to investigate the conditions leading to the onset of sanding during production, to ensure the future sustainability of hydrocarbon production operations under viscous conditions. We adopt Design of Experiments (DOE) to analyse, using Taguchi factorial designs, the most significant interactive effects of sanding. We propose an optimized regression model to predict the drawdown time at sand production. The results obtained underscore that reservoirs characterized by varying (high and low) levels of viscosity, flow rate, cementation, clay, and fine sand content show a corresponding impact on sand production. The only significant interactive effect recorded arises from the interaction BD (fine sand content and flow rate), while the main effects included fluid viscosity and cementation, with percentage significances of 31.3%, 37.76%, and 30.94%, respectively. The drawdown time model presented could be useful for predicting the time to reach the maximum drawdown pressure under viscous conditions at the onset of sand production.

Keywords: factorial designs, DOE optimization, sand production prediction, drawdown time, regression model

Procedia PDF Downloads 150