Search results for: modern methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17681

13391 Relative Effectiveness of Inquiry-Role Approach and Expository Instructional Methods in Fostering Students’ Retention in Chemistry

Authors: Joy Johnbest Egbo

Abstract:

The study was designed to investigate the relative effectiveness of the inquiry-role approach and the expository instructional method in fostering students’ retention in chemistry. Two research questions were answered, and three null hypotheses were formulated and tested at the 0.05 level of significance. A quasi-experimental (non-equivalent pretest, posttest control group) design was adopted for the study. The population for the study comprised all senior secondary school class two (SS II) students offering Chemistry in single-sex schools in the Enugu Education Zone. The instrument for data collection was a self-developed Chemistry Retention Test (CRT). Relevant data were collected from a sample of one hundred and forty-one (141) students drawn from two secondary schools (one male and one female school) using a simple random sampling technique. A reliability coefficient of 0.82 was obtained for the instrument using the Kuder–Richardson formula 20 (K-R20). Mean and standard deviation scores were used to answer the research questions, while two-way analysis of covariance (ANCOVA) was used to test the hypotheses. The findings showed that the students taught with the inquiry-role approach retained the chemistry concepts significantly better than their counterparts taught with the expository method. Female students retained slightly better than their male counterparts. There was a significant interaction between the instructional packages and gender on chemistry students’ retention. It was recommended, among other things, that teachers be encouraged to employ the inquiry-role approach more in the teaching of chemistry and of other subjects in general; by doing so, students’ retention of the subject could be increased.
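
The reliability figure above comes from the Kuder–Richardson formula 20. A minimal sketch of that computation on a hypothetical 0/1 item-score matrix (not the study's data) follows:

```python
# K-R20 reliability for a dichotomously scored test.
import numpy as np

def kr20(items: np.ndarray) -> float:
    """items: (n_students, n_items) matrix of 0/1 scored answers."""
    k = items.shape[1]
    p = items.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=(141, 1))             # 141 students, as in the study
# ability-driven responses so the 20 items correlate, giving a nonzero K-R20
scores = (rng.random((141, 20)) < 1 / (1 + np.exp(-ability))).astype(int)
print(f"K-R20 = {kr20(scores):.2f}")
```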

Keywords: inquiry role approach, retention, exposition method, chemistry

Procedia PDF Downloads 516
13390 Antagonistic Potential of Epiphytic Bacteria Isolated in Kazakhstan against Erwinia amylovora, the Causal Agent of Fire Blight

Authors: Assel E. Molzhigitova, Amankeldi K. Sadanov, Elvira T. Ismailova, Kulyash A. Iskandarova, Olga N. Shemshura, Ainur I. Seitbattalova

Abstract:

Fire blight is a quarantine bacterial disease that is very harmful to commercial apple and pear production. To date, several different methods have been proposed for disease control, including the use of copper-based preparations and antibiotics, which are not always reliable or effective. The use of bacteria as biocontrol agents is one of the most promising and eco-friendly alternative methods. Bacteria with protective activity against the causal agent of fire blight are often present among the epiphytic microorganisms of the phyllosphere of host plants. Therefore, the main objective of our study was the screening of local epiphytic bacteria as possible antagonists against Erwinia amylovora, the causal agent of fire blight. Samples of infected organs of apple and pear trees (shoots, leaves, fruits) were collected from the industrial horticulture areas in various agro-ecological zones of Kazakhstan. Epiphytic microorganisms were isolated by standard and modified methods on specific nutrient media. The primary screening of the selected microorganisms under laboratory conditions, to determine their ability to suppress the growth of Erwinia amylovora, was performed by an agar diffusion test. Among 142 bacteria isolated from the fire blight host plants, 5 isolates, belonging to the genera Bacillus, Lactobacillus, Pseudomonas, Paenibacillus, and Pantoea, showed higher antagonistic activity against the pathogen. The diameters of the inhibition zones depended on the species and ranged from 10 mm to 48 mm. The maximum inhibition zone diameter (48 mm) was exhibited by B. amyloliquefaciens. A weaker inhibitory effect was shown by Pantoea agglomerans PA1 (19 mm). The study of the inhibitory effect of Lactobacillus species against E. amylovora showed that, among the 7 isolates tested, only one (Lactobacillus plantarum 17M) demonstrated an inhibition zone (30 mm). In summary, this study was devoted to detecting beneficial epiphytic bacteria from the organs of pear and apple trees for fire blight control in Kazakhstan. Results obtained from the in vitro experiments showed that the most efficient bacterial isolates are Lactobacillus plantarum 17M, Bacillus amyloliquefaciens MB40, and Pantoea agglomerans PA1. These antagonists are suitable for development as biocontrol agents for fire blight control. Their efficacy will be evaluated further in biological tests under in vitro and field conditions during our further study.

Keywords: antagonists, epiphytic bacteria, Erwinia amylovora, fire blight

Procedia PDF Downloads 177
13389 Determination of Klebsiella Pneumoniae Susceptibility to Antibiotics Using Infrared Spectroscopy and Machine Learning Algorithms

Authors: Manal Suleiman, George Abu-Aqil, Uraib Sharaha, Klaris Riesenberg, Itshak Lapidot, Ahmad Salman, Mahmoud Huleihel

Abstract:

Klebsiella pneumoniae is one of the most aggressive multidrug-resistant bacteria associated with human infections, resulting in high mortality and morbidity. Thus, for an effective treatment, it is important to diagnose both the species of the infecting bacteria and their susceptibility to antibiotics. Currently used methods for diagnosing bacterial susceptibility to antibiotics are time-consuming (about 24 h following the first culture). Thus, there is a clear need for rapid methods to determine bacterial susceptibility to antibiotics. Infrared spectroscopy is a well-known, sensitive, and simple method that is able to detect minor biomolecular changes in biological samples associated with developing abnormalities. The main goal of this study is to evaluate the potential of infrared spectroscopy, in tandem with the Random Forest and XGBoost machine learning algorithms, to diagnose the susceptibility of Klebsiella pneumoniae to antibiotics within approximately 20 minutes following the first culture. In this study, 1190 Klebsiella pneumoniae isolates were obtained from different patients with urinary tract infections. The isolates were measured by the infrared spectrometer, and the spectra were analyzed by the Random Forest and XGBoost machine learning algorithms to determine their susceptibility to nine specific antibiotics. Our results confirm that it was possible to classify the isolates into sensitive and resistant to specific antibiotics with success rates in the range of 80–85% for the different tested antibiotics. These results prove the promising potential of infrared spectroscopy as a powerful diagnostic method for determining Klebsiella pneumoniae susceptibility to antibiotics.
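
A minimal sketch (not the authors' pipeline) of the classification step: one preprocessed spectrum per isolate, a susceptibility label per antibiotic, and a cross-validated Random Forest. The spectra and labels below are random placeholders.

```python
# Susceptible-vs-resistant classification of IR spectra with a Random Forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1190, 900))    # hypothetical: 1190 isolates x 900 wavenumbers
y = rng.integers(0, 2, size=1190)   # hypothetical labels: 0 = sensitive, 1 = resistant

clf = RandomForestClassifier(n_estimators=100, random_state=0)
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {acc.mean():.2f}")   # the study reports 80-85% on real data
```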

Keywords: urinary tract infection (UTI), Klebsiella pneumoniae, bacterial susceptibility, infrared spectroscopy, machine learning

Procedia PDF Downloads 176
13388 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique whose aim is to extract data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degrees are constrained by the condition that the sum of a data object’s memberships over all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been incorporated into the fuzzy c-means technique; it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where in our optimization problem we aim to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
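
A minimal sketch of entropy-regularized fuzzy clustering in the same spirit as the paper (not the authors' exact relative-entropy model): with a regularizer of the form gamma * sum u*log(u), the membership update has a closed softmax-like form over distances.

```python
# Entropy-regularized fuzzy c-means: alternate membership and center updates.
import numpy as np

def entropy_fcm(X, c=3, gamma=0.5, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # squared distances
        u = np.exp(-d2 / gamma)
        u /= u.sum(axis=1, keepdims=True)            # memberships sum to 1 per point
        centers = (u.T @ X) / u.sum(axis=0)[:, None]  # membership-weighted means
    return u, centers

# three synthetic Gaussian blobs, as in the paper's synthetic experiments
X = np.vstack([np.random.default_rng(i).normal(loc=3 * i, size=(50, 2)) for i in range(3)])
u, centers = entropy_fcm(X)
print(np.round(centers, 2))
```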

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 264
13387 Grammarly: Great Writings Get Work Done Using AI

Authors: Neha Intikhab Khan, Alanoud AlBalwi, Farah Alqazlan, Tala Almadoudi

Abstract:

Background: Grammarly, a widely utilized writing assistant launched in 2009, leverages advanced artificial intelligence and natural language processing to enhance writing quality across various platforms. Methods: To collect data on user perceptions of Grammarly, a structured survey was designed and distributed via Google Forms. The survey included a series of quantitative and qualitative questions aimed at assessing various aspects of Grammarly's performance. The survey comprised multiple-choice questions, Likert scale items (ranging from "strongly disagree" to "strongly agree"), and open-ended questions to capture detailed user feedback. The target population included students, friends, and family members. The collected responses were analyzed using statistical methods to quantify user satisfaction. Participation in the survey was voluntary, and respondents were assured anonymity and confidentiality. Results: The survey of 28 respondents revealed a generally favorable perception of Grammarly's AI capabilities. A significant 39.3% strongly agreed that it effectively improves text tone, with an additional 46.4% agreeing, while 10.7% remained neutral. For clarity suggestions, 28.6% strongly agreed, and 57.1% agreed, totaling 85.7% recognition of its value. Regarding grammatical accuracy across various genres, 46.4% rated it a perfect score of 5, contributing to 78.5% who found it highly effective. Conclusion: The evolution of Grammarly from a basic grammar checker to a robust AI-driven application underscores its adaptability and commitment to helping users develop their writing skills.
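
A minimal sketch of the kind of tally behind the percentages above; the response list is illustrative (11/13/3/1 out of 28 reproduces the tone-improvement figures reported).

```python
# Tally Likert-scale responses into agreement percentages.
from collections import Counter

responses = (["strongly agree"] * 11 + ["agree"] * 13
             + ["neutral"] * 3 + ["disagree"] * 1)
counts = Counter(responses)
n = len(responses)
for level in ["strongly agree", "agree", "neutral", "disagree"]:
    print(f"{level:>14}: {100 * counts[level] / n:.1f}%")
```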

Keywords: Grammarly, writing tool, user engagement, AI capabilities, effectiveness

Procedia PDF Downloads 10
13386 Agency Beyond Metaphysics of Subjectivity

Authors: Erik Kuravsky

Abstract:

One of the problems with a post-structuralist account of agency is that it appears to reject the freedom of an acting subject, thus seeming to deny the very phenomenon of agency. However, this is only a problem if we think that human beings can be agents exclusively in terms of being subjects, that is, if we think agency subjectively. Indeed, we tend to understand traditional theories of human freedom (e.g., Plato’s or Kant’s) in terms of a peculiar ability of the subject. The paper suggests de-subjectivizing agency with the help of Heidegger’s later thought. To do so, it argues that classical theories of agency may indeed be interpreted as subject-oriented (sometimes even by their authors), but do not have to be read as such. Namely, the claim is that what makes agency what it is, what is essential in agency, is not its belonging to a subject, but its ontological configuration. We may say that agency “happens,” and that there are very specific ontological characteristics to this happening. The argument of the paper is that we can find these characteristics in the classical accounts of agency and that these characteristics are sufficient to distinguish human freedom from other natural phenomena. In particular, it proposes to think agency not as one of the human characteristics, but as an ontological event in which human beings take part. Namely, agency is a (non-human) characteristic of the different modes in which the experienceable existence of beings is determined by Being. To be an agent, then, is to participate in such ontological determination. What enables this participation is the ways human beings non-thematically understand the ontological difference. For example, for Plato, one acts freely only if one is led by an idea of the good, while for Kant the imperative for free action is categorical. The agency of an agent is thus dependent on the differentiation between ideas/categories and beings met in experience – one is “free” from contingent sensibility in terms of what is different from it ontologically. In this light, the modern dependence on subjectivity is evident in the fact that the ontological difference is thought of as belonging to one’s thinking, consciousness, etc. That is, it is taken subjectively. A non-subjective account of agency, on the other hand, requires thinking this difference as belonging to Being itself, and thinking human beings as a medium within which the non-human force of ontological differentiation occurs.

Keywords: Heidegger, freedom, agency, poststructuralism

Procedia PDF Downloads 200
13385 Synthesis of Human Factors Theories and Industry 4.0

Authors: Andrew Couch, Nicholas Loyd, Nathan Tenhundfeld

Abstract:

The rapid emergence of technology observably induces disruptive effects that carry implications for internal organizational dynamics as well as external market opportunities, strategic pressures, and threats. An examination of the historical tendencies of technological innovation shows that the body of managerial knowledge for addressing such disruption is underdeveloped. Fundamentally speaking, the impacts of innovation are unique and situationally oriented. Hence, the appropriate managerial response becomes a complex function that depends on the nature of the emerging technology, the posture of internal organizational dynamics, the rate of technological growth, and much more. This research considers a particular case of mismanagement, the BP Texas City Refinery explosion of 2005, which exhibits notable failures when judged against human factors principles. Moreover, this research considers the modern technological climate (shaped by Industry 4.0 technologies) and seeks to arrive at an appropriate conceptual lens by which human factors principles and Industry 4.0 may be favorably integrated. In this manner, the careful examination of these phenomena helps to better support the sustainment of human factors principles despite the disruptive impacts imparted by technological innovation. In essence, human factors considerations are assessed through the application of principles that stem from usability engineering, the Swiss Cheese Model of accident causation, human-automation interaction, signal detection theory, alarm design, and other factors. Notably, this stream of research supports a broader framework that seeks to guide organizations amid the uncertainties of Industry 4.0 to capture higher levels of adoption, implementation, and transparency.
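
A minimal sketch of one of the human factors lenses named above, signal detection theory: the sensitivity index d' of an operator (or an alarm system) computed from hit and false-alarm rates. The rates are hypothetical.

```python
# Signal detection theory: d' = z(hit rate) - z(false-alarm rate).
from scipy.stats import norm

def d_prime(hit_rate: float, fa_rate: float) -> float:
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# hypothetical operator monitoring an alarm panel
print(f"d' = {d_prime(0.90, 0.20):.2f}")   # higher d' = better discrimination
```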

Keywords: Industry 4.0, human factors engineering, management, case study

Procedia PDF Downloads 73
13384 Westernization of Islamic Culture, A Historical Analysis

Authors: Saidalavi Kannattippadi

Abstract:

This is a culture-based study revealing the indebtedness of the West to the moral and scientific culture of Islam, to the extent that it can be said there would have been no room for the Renaissance and the Enlightenment of the West without the active intervention of Islamic culture in the thoughts and activities of European thinkers. The study focuses on the exact causes that led the West to the Renaissance and analyzes the historical evidence confirming the continuous cultural assimilation that occurred between East and West through the transmission of knowledge, the translation of unique treatises, study trips, and so on. The West was deeply influenced by the thought and culture of Islam after a long, bitter experience of the blind rituals and customs introduced by the church, and was awaiting a movement that could raise it out of moral and spiritual bankruptcy. The sequence of crusades and the voyages of thinkers from the West eastwards made Western people aware of the best culture the world had yet found, in the name of Islam, and they became ready to assimilate its notable cultural values and to borrow its cultural achievements. The West took two types of influence from Islam: moral and scientific. The uprooting of untouchability and racism from Western society and its acceptance of the ideals of equality and fraternity are moral influences, while the innumerable inventions and discoveries found in modern science and technology are scientific influences. Without the tireless efforts of Muslims in translating, modifying, and commenting on the science and philosophy of the Greeks, the West would not have had even a chance to glimpse the cultural values of the Greeks. Here, Muslims were the guides and channels through which the West became educated and cultured. The study also briefly sheds light on the cultural achievements of Muslims in the material sciences, human sciences, etc.

Keywords: cultural assimilation, culture and civilization, indebtedness, Muslim world, west, translation, transmission

Procedia PDF Downloads 400
13383 Groupthink: The Dark Side of Team Cohesion

Authors: Farhad Eizakshiri

Abstract:

The potential for groupthink to explain the issues contributing to the deterioration of decision-making ability within a unitary team, and so to cause poor outcomes, has attracted a great deal of attention from a variety of disciplines, including psychology, social and organizational studies, political science, and others. Yet what remains unclear is how and why team members’ strivings for unanimity and cohesion override their motivation to realistically appraise alternative courses of action. In this paper, the findings of a sequential explanatory mixed-methods study, comprising an experiment with thirty groups of three persons each and interviews with all experimental groups, are reported. The experiment sought to examine how individuals aggregate their views in order to reach a consensual group decision concerning the completion time of a task. The results indicated that groups made better estimates when there was no interaction between members than when groups collectively agreed on time estimates. To understand the reasons, the qualitative data and the informal observations collected during the task were analyzed through conversation analysis, leading to four reasons that caused teams to neglect divergent viewpoints and reduce the number of ideas being considered. The reasons found were the concurrence-seeking tendency, pressure on dissenters, self-censorship, and the illusion of invulnerability. It is suggested that understanding the dynamics behind these reasons for groupthink will help project teams avoid making premature group decisions by encouraging careful evaluation of the available information and analysis of the available decision alternatives and choices.
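
A minimal sketch of the comparison behind the experiment: the error of averaged independent estimates versus a single consensus estimate. The numbers are illustrative only; the study's raw data are not reproduced here.

```python
# Independent aggregation vs. consensus: compare estimation errors.
import numpy as np

true_time = 60.0                            # actual task completion time (min)
independent = np.array([52.0, 66.0, 58.0])  # members estimating alone
consensus = 45.0                            # figure agreed after group discussion

print(f"error of averaged independent estimates: {abs(independent.mean() - true_time):.1f} min")
print(f"error of consensus estimate:             {abs(consensus - true_time):.1f} min")
```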

Keywords: groupthink, group decision, cohesiveness, project teams, mixed-methods research

Procedia PDF Downloads 400
13382 The Sensitivity of Electrical Geophysical Methods for Mapping Salt Stores within the Soil Profile

Authors: Fathi Ali Swaid

Abstract:

Soil salinization is one of the most hazardous phenomena accelerating land degradation processes. It either occurs naturally or is human-induced. High levels of soil salinity negatively affect crop growth and productivity, ultimately leading to land degradation. Thus, it is important to monitor and map soil salinity at an early stage to enact an effective soil reclamation program that helps lessen or prevent future increases in soil salinity. Geophysical methods have outperformed the traditional methods for assessing soil salinity, offering more informative and professional rapid-assessment techniques for monitoring and mapping soil salinity. Soil sampling, EM38 readings, and 2D conductivity imaging have been evaluated for their ability to delineate and map the level of salinity variations at Second Ponds Creek. The three methods have shown that the subsoil in the study area is saline. Salt variations were successfully observed with each method. However, the EM38 readings and 2D inversion data show a clear spatial structure compared to the EC1:5 of the soil samples, despite the fact that the soil samples, EM38 readings, and 2D imaging data were all collected at the same locations. Because EM38 readings and 2D imaging data are weighted averages of electrical soil conductance, they are more representative of soil properties than the soil-sampling method. The mapping of the subsurface soil at the study area has been successful, and the resistivity imaging has proven to be an advantage. The soil salinity analysis (EC1:5) corresponds well with the true resistivity, together yielding a good picture of soil salinity. The soil salinity clearly indicated by the previous EM38 investigation has been confirmed by the interpretation of the true resistivity at the study area.
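
A minimal sketch of checking how co-located EC1:5 samples correspond to inverted true resistivity; since salinity tracks conductivity, the correlation is taken against 1/resistivity. All values are hypothetical.

```python
# Correlate soil-sample salinity with conductivity from 2D inversion.
import numpy as np

ec_1_5 = np.array([0.4, 0.9, 1.6, 2.2, 3.1])          # dS/m, soil samples
resistivity = np.array([25.0, 12.0, 7.5, 5.0, 3.2])   # ohm-m, 2D inversion

r = np.corrcoef(ec_1_5, 1.0 / resistivity)[0, 1]      # salinity vs. conductivity
print(f"Pearson r (EC1:5 vs. conductivity) = {r:.2f}")
```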

Keywords: 2D conductivity imaging, EM38 readings, soil salinization, true resistivity, urban salinity

Procedia PDF Downloads 381
13381 Evaluation of Electro-Flocculation for Biomass Production of Marine Microalgae Phaeodactylum tricornutum

Authors: Luciana C. Ramos, Leandro J. Sousa, Antônio Ferreira da Silva, Valéria Gomes Oliveira Falcão, Suzana T. Cunha Lima

Abstract:

The commercial production of biodiesel using microalgae demands a high energy input for harvesting biomass, making production economically unfeasible. Methods currently used involve mechanical, chemical, and biological procedures. In this work, a flocculation system is presented as a cost- and energy-effective process to increase the biomass production of Phaeodactylum tricornutum. This diatom is the only species of the genus, and it presents the fast growth and lipid-accumulation ability that are of great interest for biofuel production. The algae, selected from the Bank of Microalgae, Institute of Biology, Federal University of Bahia (Brazil), were grown in a tubular reactor with a photoperiod of 12 h (light/dark), providing an irradiance of about 35 μmol photons m⁻² s⁻¹, at a temperature of 22 °C. The medium used for growing the cells was the Conway medium, with the addition of silica. The growth curve was followed by cell counts in a Neubauer chamber and by optical density in a spectrophotometer at 680 nm. Precipitation was carried out at the end of the stationary phase of growth, 21 days after inoculation, using two methods: centrifugation at 5000 rpm for 5 min, and electro-flocculation at 19 EPD and 95 W. After precipitation, the cells were frozen at -20 °C and subsequently lyophilized. The biomass obtained by electro-flocculation was approximately four times greater than that achieved by centrifugation. The benefits of this method are that no addition of chemical flocculants is necessary and that similar cultivation conditions can be used for both biodiesel production and pharmacological purposes. The results may contribute to reducing the costs of biodiesel production using marine microalgae.
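
A minimal sketch of estimating the specific growth rate from two OD680 readings taken during the exponential phase; the readings and times are illustrative, not the study's data.

```python
# Specific growth rate from optical density during exponential growth.
import math

od_t1, od_t2 = 0.15, 0.60   # hypothetical OD680 at day 3 and day 9
t1, t2 = 3.0, 9.0           # days

mu = math.log(od_t2 / od_t1) / (t2 - t1)   # specific growth rate (per day)
print(f"mu = {mu:.3f} 1/day, doubling time = {math.log(2) / mu:.1f} days")
```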

Keywords: biomass, diatom, flocculation, microalgae

Procedia PDF Downloads 331
13380 Developing a Cloud Intelligence-Based Energy Management Architecture Facilitated with Embedded Edge Analytics for Energy Conservation in Demand-Side Management

Authors: Yu-Hsiu Lin, Wen-Chun Lin, Yen-Chang Cheng, Chia-Ju Yeh, Yu-Chuan Chen, Tai-You Li

Abstract:

Demand-Side Management (DSM) has the potential to reduce the electricity costs and carbon emissions associated with electricity use in modern society. A home Energy Management System (EMS), commonly used by residential consumers in the downstream sector of a smart grid to monitor, control, and optimize the energy efficiency of domestic appliances, is a system of computer-aided functionalities serving as an energy audit for residential DSM. Implementing fault detection and classification for the domestic appliances monitored, controlled, and optimized is one of the most important steps toward preventive maintenance, such as residential air conditioning and heating preventive maintenance in residential/industrial DSM. In this study, a cloud intelligence-based green EMS built on an Internet of Things (IoT) technology stack for residential DSM is developed. In the EMS, Arduino MEGA Ethernet communication-based smart sockets, which incorporate a Real-Time Clock chip kept in sync via the Network Time Protocol to timestamp readings, are designed and implemented to capture load phenomena reflected in the sensed voltage and current signals. Also, a Network-Attached Storage (NAS) providing data access to a heterogeneous group of IoT clients via Hypertext Transfer Protocol (HTTP) methods is configured as the data store for the parsed sensor readings. Lastly, a desktop computer with a WAMP software bundle (the Microsoft® Windows operating system, Apache HTTP Server, MySQL relational database management system, and the PHP programming language) serves as a data science analytics engine for a dynamic web app/RESTful web service of the residential DSM, providing Artificial Intelligence (AI)/Computational Intelligence. An abstract computing machine, the Java Virtual Machine, enables the desktop computer to run Java programs, and a mash-up of Java, R, and Python is configured for the AI in this study. To send real-time push notifications to IoT clients, the desktop computer implements Google-maintained Firebase Cloud Messaging to engage IoT clients across Android/iOS devices and provide a mobile notification service for residential/industrial DSM. In this study, in order to realize edge intelligence, whereby edge devices avoid network latency and the much-needed connectivity of Internet connections for the Internet of Services, support secure access to data stores, and provide immediate analytical and real-time actionable insights at the edge of the network, we upgrade the designed and implemented smart sockets to embedded AI Arduino ones (called embedded AIduino). To realize edge analytics with the proposed embedded AIduino, an Arduino Ethernet shield (WizNet W5100) with a micro SD card connector is used. The SD library is included for reading parsed data from and writing parsed data to an SD card, and an Artificial Neural Network library for the Arduino MEGA, ArduinoANN, is imported and used for the locally-embedded AI implementation. The embedded AIduino in this study can be developed further for applications in manufacturing-industry energy management and sustainable energy management, wherein, for example, rotating machinery diagnostics work to identify energy loss from gross misalignment and unbalance of rotating machines in power plants.
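
A minimal sketch (not the authors' firmware) of the data path described above: a timestamped smart-socket reading posted over HTTP to the data store. The endpoint URL and JSON field names are hypothetical.

```python
# Post one timestamped load reading to a NAS-hosted HTTP endpoint.
import json
import urllib.request
from datetime import datetime, timezone

reading = {
    "socket_id": "aiduino-01",                            # hypothetical device id
    "timestamp": datetime.now(timezone.utc).isoformat(),  # NTP-synced clock assumed
    "voltage_v": 229.8,
    "current_a": 1.42,
}
req = urllib.request.Request(
    "http://nas.local/api/readings",          # hypothetical NAS endpoint
    data=json.dumps(reading).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # commented out: requires the NAS to be reachable
print(json.dumps(reading, indent=2))
```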

Keywords: demand-side management, edge intelligence, energy management system, fault detection and classification

Procedia PDF Downloads 255
13379 Woodfuels as Alternative Source of Energy in Rural and Urban Areas in the Philippines

Authors: R. T. Aggangan

Abstract:

Woodfuels continue to be a major component of the energy supply mix of the Philippines due to an increasing demand for energy that is not adequately met by the decreasing supply and increasing prices of fuel oils such as liquefied petroleum gas (LPG) and kerosene. The Development Academy of the Philippines projects the 2016 household-sector demand for woodfuels at 28.3 million metric tons, against a combined supply potential of about 105.4 million metric tons from forest and non-forest lands. However, the Revised Master Plan for Forestry Development projects a demand of about 50 million cubic meters of fuelwood in 2016, while the capability to supply from local sources is only about 28 million cubic meters, indicating a 44% deficiency. Household demand constitutes 82%, while industrial demand is 18%. Domestic household demand for energy is for cooking needs, while the industrial demand is for steam power generation; tobacco curing barns; brick, ceramic, and pot making; bakeries; lime production; and small-scale food processing. Factors that favour increased use of wood-based energy include relatively low prices (against increasing oil-based fuel prices), the availability of efficient wood-based energy utilization technology, increasing supply, and an increasing population that cannot afford conventional fuels. Moreover, innovations in combustion technology and the cogeneration of heat and power from biomass for modern applications favour biomass energy development. This paper recommends policies and strategic directions for the development of the woodfuel industry with the twin goals of sustainably supplying the energy requirements of households and industry.
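
A minimal check of the supply gap quoted above: a demand of about 50 million cubic meters against a local supply of about 28 million.

```python
# Fuelwood supply deficiency, 2016 projection.
demand, supply = 50.0, 28.0   # million cubic meters
deficiency = (demand - supply) / demand * 100
print(f"deficiency = {deficiency:.0f}%")   # ~44%, as stated in the abstract
```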

Keywords: biomass energy development, fuelwood, households and industry, innovations in combustion technology, supply and demand

Procedia PDF Downloads 338
13378 Exploring the Spatial Relationship between Built Environment and Ride-hailing Demand: Applying Street-Level Images

Authors: Jingjue Bao, Ye Li, Yujie Qi

Abstract:

The explosive growth of ride-hailing has reshaped residents' travel behavior and plays a crucial role in urban mobility within the built environment. Contributing to research on the spatial variation of ride-hailing demand and its relationship to the built environment and socioeconomic factors, this study utilizes multi-source data from Haikou, China, to construct a Multi-scale Geographically Weighted Regression (MGWR) model that accounts for spatial scale heterogeneity. The regression results showed that the MGWR model demonstrated superior interpretability and reliability, with a 3.4% improvement in R² and a reduction in AIC from 4853 to 4787, compared with the Geographically Weighted Regression (GWR) model. Furthermore, to precisely identify the surrounding environment of each sampling point, the DeepLabv3+ model is employed to segment street-level images. Features extracted from these images are incorporated as variables in the regression model, further enhancing its rationality and accuracy, with a 7.78% improvement in R² compared with the MGWR model that considered only region-level variables. By integrating multi-scale geospatial data and utilizing advanced computer vision techniques, this study provides a comprehensive understanding of the spatial dynamics between ride-hailing demand and the urban built environment. The insights gained from this research are expected to contribute significantly to urban transportation planning and policy making, as well as to ride-hailing platforms, facilitating the development of more efficient and effective mobility solutions in modern cities.
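
A minimal sketch of fitting an MGWR model on synthetic data; the class and function names follow the open-source `mgwr` Python package, which is an assumption about tooling (the abstract does not name its software). `coords`, `y`, and `X` stand in for zone centroids, ride-hailing demand, and built-environment covariates.

```python
# MGWR fit with one bandwidth per covariate, on standardized synthetic data.
import numpy as np
from mgwr.gwr import MGWR
from mgwr.sel_bw import Sel_BW

rng = np.random.default_rng(0)
n = 100
coords = rng.uniform(0, 10, size=(n, 2))        # hypothetical zone centroids
X = rng.normal(size=(n, 3))                     # hypothetical covariates
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(scale=0.2, size=n)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # MGWR is usually fit on z-scores
ys = ((y - y.mean()) / y.std()).reshape(-1, 1)

selector = Sel_BW(coords, ys, Xs, multi=True)   # bandwidth search per covariate
selector.search(multi_bw_min=[2])
results = MGWR(coords, ys, Xs, selector).fit()
print(results.aicc, results.R2)                 # criteria reported in the abstract
```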

Keywords: travel behavior, ride-hailing, spatial relationship, built environment, street-level image

Procedia PDF Downloads 86
13377 Screening of Antiviral Compounds in Medicinal Plants: Non-Volatiles

Authors: Tomas Drevinskas, Ruta Mickiene, Audrius Maruska, Nicola Tiso, Algirdas Salomskas, Raimundas Lelesius, Agneta Karpovaite, Ona Ragazinskiene, Loreta Kubiliene

Abstract:

The antiviral effect of substances accumulated by plants and of natural products is known to ethnopharmacy and modern medicine. Antiviral properties are usually assigned to volatile compounds and polyphenols. This research work is divided into several parts, and the task of this part was to investigate potential plants, potential substances, and potential preparation conditions that can be used for the preparation of antiviral agents. Sixteen different medicinal plants, their parts, and two types of propolis were selected for screening. Firstly, the extraction conditions for non-volatile compounds were investigated: 3 pre-selected plants were extracted with 5 different ethanol–water mixtures (96%, 75%, 60%, 40%, and 20% vol.) and with bidistilled water. Total phenolic content, total flavonoid content, and radical scavenging activity were determined. The results indicated that the optimal extractant is a 40% vol. ethanol–water mixture. Further investigations were performed with the 40% vol. ethanol–water extractant. All 16 selected plants, their parts, and the two types of propolis were extracted using the selected extractant. The determined total phenolic content, total flavonoid content, and radical scavenging activity indicated that extracts of Origanum vulgare L., Mentha piperita L., Geranium macrorrhizum L., Melissa officinalis L., and Desmodium canadense L. contain the highest amounts of extractable phenolic compounds (7.31, 5.48, 7.88, 8.02, and 7.16 rutin equivalents (mg/ml), respectively), flavonoid content (2.14, 2.23, 2.49, 0.79, and 1.51 rutin equivalents (mg/ml), respectively), and radical scavenging activity (11.98, 8.72, 13.47, 13.22, and 12.22 rutin equivalents (mg/ml), respectively). The composition of the extracts is analyzed using HPLC.
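
A minimal sketch of how a result is typically expressed in rutin equivalents: a linear calibration curve is fit to rutin standards and a sample's absorbance is converted back through it. The absorbance values are illustrative.

```python
# Rutin-equivalent quantification via a linear calibration curve.
import numpy as np

rutin_mg_ml = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # standard concentrations
absorbance = np.array([0.02, 0.11, 0.21, 0.40, 0.81])   # measured for standards

slope, intercept = np.polyfit(rutin_mg_ml, absorbance, 1)
sample_abs = 0.35
print(f"{(sample_abs - intercept) / slope:.2f} mg/ml rutin equivalents")
```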

Keywords: antiviral effect, plants, propolis, phenols

Procedia PDF Downloads 329
13376 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings

Authors: Jude K. Safo

Abstract:

Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text, and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we observe in Large Language Models. Notable attempts such as TransE, TransR, and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited in scope to next-node/link prediction. Traditional linear methods like Tucker, CP, PARAFAC, and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for a linear mapping between concepts in KG space and GE space that preserves cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures, and we demonstrate the performance this model achieves on the WN18 benchmark. This model does not rely on Large Language Models (LLMs), though the applications are certainly relevant there as well.
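
A minimal sketch of the TransE scoring rule referenced above: a triple (h, r, t) is judged plausible when the translated head h + r lands close to the tail t in embedding space. The embeddings below are random placeholders.

```python
# TransE plausibility scoring: score(h, r, t) = -||h + r - t||.
import numpy as np

rng = np.random.default_rng(0)
dim = 50
h, r, t = (rng.normal(size=dim) for _ in range(3))   # hypothetical embeddings

def transe_score(h, r, t):
    return -np.linalg.norm(h + r - t)   # higher (less negative) = more plausible

print(f"score(h, r, t)     = {transe_score(h, r, t):.3f}")
print(f"score(h, r, h + r) = {transe_score(h, r, h + r):.3f}")  # perfect fit -> 0
```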

Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics

Procedia PDF Downloads 72
13375 Detecting Anomalous Matches: An Empirical Study from the National Basketball Association

Authors: Jacky Liu, Dulani Jayasuriya, Ryan Elmore

Abstract:

Match fixing and anomalous sports events have increasingly threatened the integrity of professional sports, prompting concerns about existing detection methods. This study addresses the limitations of prior research on match-fixing detection, improving the identification of potentially fraudulent matches by incorporating advanced anomaly detection techniques. We develop a novel method to identify anomalous matches and player performances by examining series of matches, such as playoffs. Additionally, we investigate bettors' potential profits when avoiding anomalous matches and explore the factors behind unusual player performances. Our literature review covers match-fixing detection, match outcome forecasting models, and anomaly detection methods, underscoring current limitations and proposing a new sports anomaly detection method. Our findings reveal anomalous series in the 2022 NBA playoffs, with the Phoenix Suns vs. Dallas Mavericks series having the lowest natural occurrence probability. We identify abnormal player performances and show that bettors' profits decrease significantly when post-season matches are included. This study contributes by developing a new approach to detecting anomalous matches and player performances, and by assisting investigators in identifying responsible parties. While we cannot conclusively establish the reasons behind unusual player performances, our findings suggest factors such as team financial difficulties, executive mismanagement, and individual player contract issues.
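
A minimal sketch of one natural-occurrence baseline for a playoff series: the probability of each possible best-of-7 series length given a fixed per-game win probability p, assuming independent games (the paper's model is richer than this).

```python
# Probability a best-of-7 series ends in exactly `length` games.
from math import comb

def series_length_prob(p: float, length: int) -> float:
    # the winner takes game `length` and exactly 3 of the first length-1 games
    win = comb(length - 1, 3) * p**4 * (1 - p) ** (length - 4)
    lose = comb(length - 1, 3) * (1 - p) ** 4 * p ** (length - 4)
    return win + lose

for n in (4, 5, 6, 7):
    print(f"P(series ends in {n}) = {series_length_prob(0.55, n):.3f}")
```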

Keywords: anomaly match detection, match fixing, match outcome forecasting, problematic players identification

Procedia PDF Downloads 84
13374 Architecture for QoS Based Service Selection Using Local Approach

Authors: Gopinath Ganapathy, Chellammal Surianarayanan

Abstract:

Services are growing rapidly, and they are generally aggregated into a composite service to accomplish complex business processes. There may be several services that offer the same required function for a particular task in a composite service. Hence, a choice has to be made to select suitable services from among functionally similar alternatives. Quality of Service (QoS) serves as the discriminating factor in deciding which component services should be selected to satisfy the quality requirements of a user during service composition. There are two categories of approaches to QoS-based service selection, namely global and local approaches. Global approaches are known to be NP-hard in time and offer poor scalability in large-scale composition. As an alternative to global methods, local selection methods, which reduce the search space by breaking the large, complex problem of selecting services for the workflow into independent sub-problems of selecting services for individual tasks, are emerging. In this paper, a distributed architecture for selecting services based on QoS using local selection is presented, with an overview of the local selection methodology. The architecture describes the core components, namely the selection manager and the QoS manager, needed to implement the local approach, and their functions. The selection manager consists of two components: a constraint decomposer, which decomposes the given global or workflow-level constraints into local or task-level constraints, and a service selector, which selects the appropriate service for each task with maximum utility while satisfying the corresponding local constraints. The QoS manager manages the QoS information at two levels, namely the service-class level and the individual-service level. The architecture serves as an implementation model for local selection.
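
A minimal sketch of the local selection step described above: for each task, pick the candidate with maximum utility among those meeting the decomposed task-level constraint. Service names, utilities, and latency budgets are hypothetical.

```python
# Local QoS-based selection: per-task utility maximization under a constraint.
tasks = {
    "payment":  [("svcA", 0.81, 120), ("svcB", 0.92, 310), ("svcC", 0.88, 150)],
    "shipping": [("svcD", 0.70, 90),  ("svcE", 0.95, 400)],
}  # (name, utility, latency in ms)
latency_budget = {"payment": 200, "shipping": 100}   # decomposed local constraints

for task, candidates in tasks.items():
    feasible = [c for c in candidates if c[2] <= latency_budget[task]]
    best = max(feasible, key=lambda c: c[1])         # maximize utility
    print(f"{task}: {best[0]} (utility {best[1]}, latency {best[2]} ms)")
```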

Keywords: architecture of service selection, local method for service selection, QoS based service selection, approaches for QoS based service selection

Procedia PDF Downloads 428
13373 Virtual Team Management in Companies and Organizations

Authors: Asghar Zamani, Mostafa Falahmorad

Abstract:

Virtualization is established to combine and use the unique capabilities of employees to increase productivity and agility in providing services regardless of location. Adapting to fast, continuous change and getting maximum access to human resources are the reasons why virtualization is happening. The distance problem is solved by information, flexibility is the most important feature of virtualization, and information will be the main focus of virtualized companies. In this research, we used the window of opportunity provided by Covid-19 to compare the productivity of companies that had moved toward more virtualized management before Covid-19 with that of companies that only started planning and developing virtual-management infrastructure after the pandemic crisis occurred. The research process includes the assessment of financial (profitability and customer satisfaction) and behavioral (organizational culture and reluctance to change) metrics. In addition to financial and CRM KPIs, a questionnaire was devised to assess how managers' and employees' attitudes have been changing toward the migration to virtualization. The sample companies and questions were selected with input from experts in the IT industry of Iran. In this article, the conclusion is that companies open to virtualization on the basis of accurate strategic planning, or willing to pay to train their employees for virtualization before the pandemic, are more agile in adapting to change and moving forward in a recession. The forward-looking companies in this research not only could compensate for the short-term loss from the first shock of Covid-19, but could also foresee the new needs of their customers sooner than their competitors, resulting in the need to employ new staff to meet the emerging demand. The findings were aligned with the literature review. The results can be a wake-up call for business owners, especially in developing countries, to be more open to modern management styles instead of continuing with traditional ones.

Keywords: virtual management, virtual organization, competitive advantage, KPI, profit

Procedia PDF Downloads 86
13372 Ecosystem Model for Environmental Applications

Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru

Abstract:

This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, in order to determine appropriate methods of action for reducing adverse effects on the environment and, implicitly, on the population. The proposed model provides a new perspective for environmental assessment, and it can be used as a practical instrument for decision-making.
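
A minimal sketch of the kind of fuzzy building block such a model rests on (not the authors' specific rule base): a triangular membership function grading an environmental indicator.

```python
# Triangular fuzzy membership function for an environmental indicator.
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership rising from a to a peak at b, then falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# hypothetical "moderate pollution" fuzzy set on a 0-100 indicator scale
for x in (20, 40, 50, 60, 80):
    print(f"indicator={x}: membership={triangular(x, 30, 50, 70):.2f}")
```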

Keywords: ecosystem model, environmental security, fuzzy logic, sustainability of habitable regions

Procedia PDF Downloads 427
13371 Neuroecological Approach for Anthropological Studies in Archaeology

Authors: Kalangi Rodrigo

Abstract:

The term neuroecology denotes the study of adaptive variation in cognition and the brain. The subject emerged in the 1980s, when researchers began to apply the methods of comparative evolutionary biology to cognitive processes and the underlying neural mechanisms of cognition. In archaeology and anthropology, we observe behaviors such as social learning skills, innovative feeding and foraging, tool use, and social manipulation to determine the cognitive processes of ancient mankind. In such comparative work, brainstem size has been used as a control variable, and phylogeny has been controlled for using independent contrasts. Both disciplines need to be enriched with the comparative literature and with neurological, experimental, and behavioral studies among tribal peoples as well as primate groups, which will lead the research to a fruitful end. Neuroecology examines the relations between ecological selection pressures and species or sex differences in cognition and the brain. The goal of neuroecology is to understand how natural selection acts on perception and its neural apparatus. Furthermore, neuroecology will eventually lead both principal disciplines to ethology, in which human behavior and social organization are studied from a biological perspective, whether ethnoarchaeological or prehistoric. Archaeology should adopt the general approach of neuroecology: phylogenetic comparative methods can be used in the field, yielding new findings on the cognitive mechanisms and brain structures involved in mating systems, social organization, communication, and foraging. The contribution of neuroecology to archaeology and anthropology is the information it provides on the selective pressures that have influenced the evolution of cognition and the brain structure of mankind. It will shed new light on the path of evolutionary studies, including behavioral ecology, primate archaeology, and cognitive archaeology.

Keywords: neuroecology, archaeology, brain evolution, cognitive archaeology

Procedia PDF Downloads 124
13370 Determination of Geotechnical Properties of Travertine Lithotypes in Van-Turkey

Authors: Ali Ozvan, Ismail Akkaya, Mucip Tapan

Abstract:

Travertine is generally a weak to medium-strong rock, and the physical, mechanical, and structural properties of travertines have a direct impact on geotechnical studies. New settlement areas were designated on travertine units after two destructive earthquakes, which occurred on October 23rd, 2011 (M=7.1) and November 9th, 2011 (M=5.6) in the Tabanlı and Edremit districts of Van province in Turkey, respectively. In the study area, the travertines have different lithotypes and engineering properties, such as a strong crystalline crust, a medium-strong shrub lithotype, and a weak reed lithotype, which can affect the mechanical and engineering properties of the travertine; each level has its own handicaps. Travertine has a higher strength than soil ground; however, it can have various handicaps, such as poor rock mass quality, karst caves, and weathering alteration. The physico-mechanical properties of the travertine in the study area were determined by laboratory tests and field observations. Uniaxial compressive strength (UCS) values were estimated by indirect methods, and a strength map of the different lithotypes of the Edremit travertine was created in order to define suitable settlement areas. Also, the rock mass properties and underground structure were determined by boreholes, field studies, and geophysical methods. The aim of this study is to investigate the relationship between lithotype and the physico-mechanical properties of travertines. According to the results, lithotype has an effect on the physical, mechanical, and rock mass properties of the travertine levels. Several research methods show that various handicaps may arise in such areas when the active tectonic structure of the area is evaluated along with the karstic cavities within the travertine and the different lithotype qualities.
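
A minimal sketch of one widely used indirect UCS estimate (not necessarily the authors' correlation): scaling the point load index Is50 by a conversion factor k, commonly taken around 20-25 for many rock types. The specimen values are hypothetical.

```python
# Indirect UCS estimate from the point load index: UCS ~ k * Is50.
point_load_is50 = [0.8, 1.4, 2.1]   # MPa, hypothetical travertine specimens
k = 24                              # assumed conversion factor

for is50 in point_load_is50:
    print(f"Is50 = {is50:.1f} MPa -> UCS ~ {k * is50:.0f} MPa")
```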

Keywords: travertine, lithotype, geotechnical parameters, Van earthquake

Procedia PDF Downloads 234
13369 Barnard Feature Point Detector for Low-Contrast Periapical Radiography Images

Authors: Chih-Yi Ho, Tzu-Fang Chang, Chih-Chia Huang, Chia-Yen Lee

Abstract:

In dental clinics, dentists use periapical radiography images to assess the effectiveness of the endodontic treatment of teeth with chronic apical periodontitis. Periapical radiography images are taken at different times to assess alveolar bone variation before and after root canal treatment, and furthermore to judge whether the treatment was successful. The current clinical assessment of apical tissue recovery relies only on the dentist's personal experience. It is difficult to achieve standardized, objective interpretations because of differences in the dentists' or radiologists' personal backgrounds and knowledge. If periapical radiography images taken at different times could be registered well, the endodontic treatment could be evaluated. In the image registration area, it is necessary to assign representative control points to the transformation model for good registration performance. However, the detection of representative control points (feature points) on periapical radiography images is generally very difficult. Regardless of which traditional detection methods are used, sufficient feature points may not be detected due to the low-contrast characteristics of the x-ray image. The Barnard detector is an algorithm for feature point detection based on gray-scale value gradients, which can obtain sufficient feature points even when the gray-scale contrast is not obvious. However, the Barnard detector detects too many feature points, and they tend to be too clustered. This study uses the local extrema of the clustered feature points and a suppression radius to overcome this problem, and compares different feature point detection methods. In preliminary results, the feature points could be detected as representative control points by the proposed method.
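
A minimal sketch (not Barnard's exact algorithm) of the two-step idea above: a gradient-based response, then keeping only local maxima within a suppression radius so the detected points are not overly clustered.

```python
# Gradient-magnitude feature response with radius-based non-maximum suppression.
import numpy as np
from scipy.ndimage import maximum_filter

def detect(image: np.ndarray, radius: int = 5, thresh: float = 0.1):
    gy, gx = np.gradient(image.astype(float))
    response = np.hypot(gx, gy)                       # gradient magnitude
    local_max = maximum_filter(response, size=2 * radius + 1)
    keep = (response == local_max) & (response > thresh)
    return np.argwhere(keep)                          # (row, col) feature points

img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0                               # toy low-contrast blob
print(detect(img, radius=5)[:5])
```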

Keywords: feature detection, Barnard detector, registration, periapical radiography image, endodontic treatment

Procedia PDF Downloads 445
13368 Bedouin Dialects: Language Use and Identity Perceptions of Bedouin-Speaking University Students in North-Western Saudi Arabia and Implications for Language Vitality

Authors: Hend Albalawi

Abstract:

Amid the dynamic use of the Arabic language worldwide, Saudi Arabia employs Modern Standard Arabic (MSA) as its formal, official language, whereas other dialects of Arabic are common in informal situations. Such trends not only maintain the powerful, state-supported status of MSA but are also liable to affect the use and status of other varieties, including Bedouin dialects, and to prompt code-mixing behaviour among their speakers. Exposure to MSA and English in education in Saudi Arabia may also reduce the vitality of Bedouin dialects in the country, particularly among the current generations of educated Bedouin speakers. Therefore, the proposed research will examine the perceived vitality of Bedouin dialects in light of Saudi language policies that prescribe MSA as the official national language of Saudi Arabia and require university students to complete English-language coursework in the national education system. It will also entail identifying Bedouin speakers’ attitudes towards the use of Bedouin dialects in order to assess the need, if any, to implement policies in Saudi Arabia that can enhance the use of those dialects amid the competing use of MSA and English in the country. Empirical data collected from questionnaires and semi-structured interviews that capture patterns of everyday language use among Bedouin-speaking university students in Tabuk, together with the content of language policy documents, can clarify whether policy-based pressure to use MSA and English in mainstream educational and social activities in Saudi Arabia has jeopardised the language vitality of Bedouin dialects in north-west Saudi Arabia. The findings of the research can thus ultimately contribute to the development of policies to support and enhance the use of Bedouin dialects and, in turn, their language vitality.

Keywords: attitudes, Bedouin dialects, language policy, vitality

Procedia PDF Downloads 124
13367 The Population Death Model and Influencing Factors from the Data of The "Sixth Census": Zhangwan District Case Study

Authors: Zhou Shangcheng, Yi Sicen

Abstract:

Objective: To understand the mortality patterns of Zhangwan District in 2010 and provide a basis for the development of scientific and rational health policy. Methods: Data were collected from the Sixth Census of Zhangwan District and the disease surveillance system. The statistical analysis covers differences in mortality by age, gender, region, and time, as well as related factors. Methods developed for the Global Burden of Disease (GBD) Study by the World Bank and the World Health Organization (WHO) were adapted and applied to the population health data of Zhangwan District. The DALY rate per 1,000 was calculated for the various causes of death. SPSS 16 was used for the statistical analysis. Results: The data on the deceased population of Zhangwan District show that the crude mortality rate was 6.03‰. There was a significant difference between the male and female mortality rates, which were 7.37‰ and 4.68‰, respectively. Life expectancy at birth in Zhangwan District in 2010 was 78.40 years (males 75.93, females 81.03). The five leading causes of YLL, in descending order, were: cardiovascular diseases (42.63 DALY/1000), malignant neoplasms (23.73 DALY/1000), unintentional injuries (5.84 DALY/1000), respiratory diseases (5.43 DALY/1000), and respiratory infections (2.44 DALY/1000). In addition, there is a strong relation between marital status, educational level, and mortality, to a certain extent. Conclusion: Zhangwan District, at the city level, is at a relatively low mortality level. The mortality of the total population of Zhangwan District shows a downward trend, and life expectancy is rising.
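
A minimal sketch of the two headline rates above. The abstract reports only the rates themselves, so the death count, population, and YLL figures below are hypothetical values chosen to reproduce them.

```python
# Crude mortality rate and a cause-specific DALY rate per 1,000 population.
deaths = 2_170
population = 360_000            # hypothetical mid-year population
crude_rate = deaths / population * 1000
print(f"crude mortality rate = {crude_rate:.2f} per thousand")   # ~6.03

yll_cardio = 15_347             # hypothetical years of life lost, cardiovascular
print(f"cardiovascular burden = {yll_cardio / population * 1000:.2f} DALY/1000")
```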

Keywords: sixth census, Zhangwan district, death level differences, influencing factors, cause of death

Procedia PDF Downloads 274
13366 Microstructure Evolution and Pre-transformation Microstructure Reconstruction in Ti-6Al-4V Alloy

Authors: Shreyash Hadke, Manendra Singh Parihar, Rajesh Khatirkar

Abstract:

In the present investigation, the variation of the microstructure with changes in the heat treatment conditions, i.e., temperature and time, was observed. Ti-6Al-4V alloy was subjected to solution annealing treatments in the β (1066 °C) and α+β (930 °C and 850 °C) phase fields, followed by quenching, air cooling, and furnace cooling to room temperature, respectively. The effect of solution annealing and cooling on the microstructure was studied using optical microscopy (OM), scanning electron microscopy (SEM), electron backscattered diffraction (EBSD), and x-ray diffraction (XRD). The chemical composition of the β phase for the different conditions was determined with the help of an energy dispersive spectrometer (EDS) attached to the SEM. Furnace cooling resulted in the development of a coarser (α+β) structure, while air cooling resulted in a much finer structure with a Widmanstätten morphology of α at the grain boundaries. Quenching from the solution annealing temperature formed α’ martensite, its proportion depending on the temperature in the β phase field. It is well known that the transformation of β to α follows the Burgers orientation relationship (OR). In order to reconstruct the microstructure of the parent β phase, a MATLAB code was written using the neighbor-to-neighbor method, the triplet method, and Tari’s method. The code was tested on the annealed samples (1066 °C solution annealing temperature followed by furnace cooling to room temperature). The parent phase data thus generated were then plotted using the TSL-OIM software. The reconstruction results of the above methods were compared and analyzed. Tari’s approach (a clustering approach) gave better results than the neighbor-to-neighbor and triplet methods, but the triplet method took the least time of the three.
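
A minimal sketch of the core computation behind grouping α variants by the Burgers OR: the misorientation angle between two orientation matrices. A full reconstruction would also apply the crystal symmetry operators, which are omitted here.

```python
# Misorientation angle between two orientation (rotation) matrices.
import numpy as np

def misorientation_deg(g1: np.ndarray, g2: np.ndarray) -> float:
    dg = g1 @ g2.T
    cos_theta = np.clip((np.trace(dg) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

def rot_z(deg: float) -> np.ndarray:
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

print(f"{misorientation_deg(rot_z(10.0), rot_z(70.0)):.1f} deg")  # -> 60.0
```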

Keywords: Ti-6Al-4V alloy, microstructure, electron backscattered diffraction, parent phase reconstruction

Procedia PDF Downloads 451
13365 Effects of Centrifugation, Encapsulation Method and Different Coating Materials on the Total Antioxidant Activity of the Microcapsules of Powdered Cherry Laurels

Authors: B. Cilek Tatar, G. Sumnu, M. Oztop, E. Ayaz

Abstract:

Encapsulation protects sensitive food ingredients against heat, oxygen, moisture, and pH until they are released into the system. It can mask the unwanted taste of nutrients that are added to foods for fortification purposes. Cherry laurels (Prunus laurocerasus) contain phenolic compounds, which reduce susceptibility to several chronic diseases, such as certain cancers and cardiovascular diseases. The objective of this research was to study the effects of centrifugation, different coating materials, and homogenization methods on the microencapsulation of powders obtained from cherry laurels. In this study, maltodextrin and a maltodextrin:whey protein mixture at a ratio of 1:3 (w/w) were chosen as the coating materials. The total solid content of the coating materials was kept constant at 10% (w/w). Capsules were obtained from powders of freeze-dried cherry laurels through an encapsulation process using a silent crusher homogenizer or microfluidization. Freeze-dried cherry laurels were the core material, and the core-to-coating ratio was chosen as 1:10 by weight. To homogenize the mixture, a high-speed homogenizer was used at 4000 rpm for 5 min. Then, a silent crusher or microfluidizer was used to complete the encapsulation process. The mixtures were treated either by the silent crusher for 1 min at 75000 rpm or by the microfluidizer at 50 MPa for 3 passes. Freeze drying for 48 hours was applied to the emulsions to obtain capsules in powder form. After these steps, the dry capsules were ground manually into a fine powder. The microcapsules were analyzed for total antioxidant activity by the DPPH (1,1-diphenyl-2-picrylhydrazyl) radical scavenging method. Prior to high-speed homogenization, the samples were centrifuged (4000 rpm, 1 min). Centrifugation was found to have a positive effect on the total antioxidant activity of the capsules. Microcapsules treated by the microfluidizer were found to have higher total antioxidant activities than those treated by the silent crusher. It was found that increasing the whey protein concentration in the coating material (using the maltodextrin:whey protein 1:3 mixture) had a positive effect on total antioxidant activity for both the silent crusher and microfluidization methods. Therefore, microfluidization of centrifuged mixtures can be selected as the best condition for the encapsulation of cherry laurel powder, considering the total antioxidant activity of the capsules. In this study, it was shown that capsules prepared by these methods can be recommended for incorporation into foods in order to enhance their functionality by increasing antioxidant activity.
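
A minimal sketch of how DPPH radical scavenging activity is typically computed from absorbance readings; the two absorbance values are illustrative, not the study's measurements.

```python
# DPPH radical scavenging activity as percent inhibition.
a_control = 0.820   # absorbance of DPPH without extract
a_sample = 0.215    # absorbance with microcapsule extract

inhibition = (a_control - a_sample) / a_control * 100
print(f"radical scavenging activity = {inhibition:.1f}%")
```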

Keywords: antioxidant activity, cherry laurel, microencapsulation, microfluidization

Procedia PDF Downloads 297
13364 The Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster: A Qualitative Study

Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon

Abstract:

In a disaster event, sharing patient information between pre-hospital Emergency Medical Services (EMS) and hospital Emergency Departments (ED) is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and ED professionals through the use of Information and Communication Technology (ICT). The study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analyzed according to the key screening themes which emerged from the literature search. Twenty-two studies were included: eleven employed quantitative methods, seven used qualitative methods, and four used mixed methods. Four themes emerged on communication between emergency medical teams (EMTs; pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination: disaster plans are in place in hospitals and, in some cases, there are interagency agreements with pre-hospital services and relevant stakeholders; however, the plans highlighted in these studies lacked information on coordinated communication within and between the pre-hospital and hospital settings. (2) Communication systems used in disasters: although various communication systems are used within and between hospitals and pre-hospital services, technical issues have hindered communication between teams during disasters. (3) Integrated information management systems: this theme suggested the need for an integrated health information system that can help pre-hospital and hospital staff to record patient data and ensure the data are shared. (4) Disaster training and drills: while some studies analyzed disaster drills and training, the majority focused on hospital departments other than EMTs; these studies suggest the need for simulation-based disaster training and drills that include EMTs. This review demonstrates that considerable gaps remain in the understanding of communication between EMS and ED hospital staff in disaster response. It shows that although different types of ICT are used, various issues remain that affect coordinated communication among the relevant professionals.

Keywords: emergency medical teams, communication, information and communication technologies, disaster

Procedia PDF Downloads 129
13363 Fractal Nature of Granular Mixtures of Different Concretes Formulated with Different Methods of Formulation

Authors: Fatima Achouri, Kaddour Chouicha, Abdelwahab Khatir

Abstract:

Quality concrete must be made from selected materials combined in optimal proportions so that a minimum of voids remains in the material after placement. The formulation methods in common use are mostly based on a grading curve that describes an 'optimal granularity'. Many authors have carried out fundamental research on granular arrangements, and comparisons of mathematical models of these arrangements with experimental compactness measurements confirm that the minimum porosity P as a function of the granular extent follows a power law. The best compactness in a finite medium is thus obtained with power-law gradings, such as those of Furnas, Fuller, or Talbot, each favoring a particular exponent between 0.20 and 0.50. These considerations converge on the assumption that the optimal granularity of Caquot can be approximated by a power law. By analogy, the granular structure can then be analyzed as fractal, since the internal self-similarity that characterizes fractal objects is likewise expressed by a power law. Optimized mixtures may then be described as a series of successively finer granular classes filling the voids left by the coarser ones in a regular hierarchical distribution, which would give the mix, through cascading effects, the same structure at different scales. This model is likely appropriate over the entire extent of the size distribution of the components, from correctly deflocculated cement particles (and silica fume) of micrometric dimensions up to aggregates of sometimes several tens of millimeters. As part of this research, the aim is to illustrate the application of fractal analysis, through a so-called fractal dimension, to the characterization of optimized granular concrete mixtures; for the different concretes studied, a fractal structure of the granular mixture was demonstrated regardless of the formulation method or the type of concrete.
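
To illustrate the power-law/fractal connection, a minimal sketch follows. The Talbot form of the grading curve is standard; the sieve sizes, d_max, exponent, and the relation D_f = 3 − n for a power-law cumulative mass distribution are assumptions of this example, not values from the paper:

```python
import numpy as np

def talbot_passing(d, d_max, n):
    """Cumulative percent passing of the Fuller/Talbot power-law grading
    P(d) = 100 * (d / d_max)**n, with exponent n typically 0.20-0.50."""
    return 100.0 * (d / d_max) ** n

def fractal_dimension(d, passing):
    """Grading exponent n from the log-log slope of the grading curve, and
    the associated fractal dimension D_f = 3 - n (valid for a power-law
    cumulative mass distribution; an assumption of this sketch)."""
    n, _ = np.polyfit(np.log(d), np.log(passing), 1)
    return 3.0 - n

d = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])  # sieve sizes in mm (illustrative)
p = talbot_passing(d, d_max=20.0, n=0.45)            # ideal Talbot curve, n = 0.45
print(round(fractal_dimension(d, p), 2))             # 2.55
```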

Keywords: concrete formulation, fractal character, granular packing, method of formulation

Procedia PDF Downloads 264
13362 Text Mining of Veterinary Forums for Epidemiological Surveillance Supplementation

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Web scraping and text mining are popular computer science methods deployed by public health researchers to augment traditional epidemiological surveillance. However, within veterinary disease surveillance, such techniques are still in the early stages of development and have not yet been fully utilised. This study presents an exploration into the utility of incorporating internet-based data to better understand the smallholder farming communities within Scotland by using online text extraction and the subsequent mining of these data. Web scraping of livestock forums was conducted in conjunction with text mining of the data in search of common themes, words, and topics found within the text. Results from bi-grams and topic modelling uncover four main topics of interest pertaining to aspects of livestock husbandry: feeding, breeding, slaughter, and disposal. These topics were found in both the poultry and pig sub-forums. Topic modelling appears to be a useful method of unsupervised classification for this form of data, as it produced clusters relating to biosecurity and animal welfare. Internet data can be a very effective tool in aiding traditional veterinary surveillance methods, but human validation of such data remains crucial. This opens avenues for further research through the incorporation of other dynamic social media data, namely Twitter and Facebook/Meta, together with time series analysis to highlight temporal patterns.
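
As an illustration of the bi-gram and topic-modelling step, a minimal scikit-learn sketch follows. The study's actual toolchain and corpus are not specified here; the example posts, vectorizer settings, and n_components=4 are illustrative assumptions:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical scraped posts standing in for the livestock forum corpus.
posts = [
    "best feed mix for laying hens over winter",
    "advice on breeding gilts for the first time",
    "home slaughter rules for pigs and carcass disposal",
    "rotating pasture and supplementary feeding for weaners",
]

# Bi-gram counts mirror the bi-gram analysis; ngram_range=(1, 2) would add unigrams.
vectorizer = CountVectorizer(ngram_range=(2, 2), stop_words="english")
X = vectorizer.fit_transform(posts)

# Unsupervised topic discovery; n_components=4 echoes the four themes found
# (feeding, breeding, slaughter, disposal) but is a free parameter.
lda = LatentDirichletAllocation(n_components=4, random_state=0).fit(X)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-3:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```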

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, smallholding, social media, web scraping, sentiment analysis, geolocation, text mining, NLP

Procedia PDF Downloads 105