Search results for: data exchange
23175 Assessment of Rainfall Erosivity, Comparison among Methods: Case of Kakheti, Georgia
Authors: Mariam Tsitsagi, Ana Berdzenishvili
Abstract:
Rainfall intensity change is one of the main indicators of climate change. It has a great influence on agriculture as one of the main factors causing soil erosion. Splash and sheet erosion are among the most prevalent types and the most harmful to agriculture. They are invisible to the eye at first, but the process gradually develops into stream-cutting erosion. Our study provides an assessment of rainfall erosivity potential in the Kakheti region using modern research methods. The region is the country's major producer of wheat and wine. Kakheti is located in the eastern part of Georgia and is characterized by quite a variety of natural conditions. The climate is dry subtropical. Assessing the exact rate of rainfall erosivity potential requires several years of rainfall data recorded at short intervals. Unfortunately, of the 250 meteorological stations active during the Soviet period, only 55 are active now, and only 5 of them are in the Kakheti region. Data on rainfall intensity in this region exist since 1936, and rainfall erosivity potential is assessed in some older papers, but since 1990 there have been no data on this factor, which is a necessary parameter for determining rainfall erosivity potential. On the other hand, researchers and local communities suppose that rainfall intensity has been changing and that the number of hail days has been increasing. It is therefore very important to find a method that allows rainfall erosivity potential in the Kakheti region to be determined as accurately as possible. The study period was divided into three sections: 1936-1963, 1963-1990, and 1990-2015. For the first two periods, rainfall erosivity potential was determined from the scientific literature and old meteorological stations' data. It is known that in eastern Georgia, at the boundary between the steppe and forest zones, rainfall erosivity in 1963-1990 was 20-75% higher than in 1936-1963. For the third period (1990-2015), we have no rainfall intensity data. A variety of studies discuss alternative ways of calculating rainfall erosivity potential when such data are lacking, e.g., based on daily rainfall data, average annual rainfall data, the elevation of the area, etc. It should be noted that these methods give totally different results under different climatic conditions, and in some cases huge errors. Three of the most common methods were selected for our research. Each was tested on the first two sections of the study period. Based on the outcomes, the method most suitable for the regional climatic conditions was selected, and we then used it to determine rainfall erosivity potential for the third section of the study period. Outcome data such as attribute tables and graphs were linked to the database of Kakheti, and appropriate thematic maps were created. The results allowed us to analyze changes in rainfall erosivity potential from 1936 to the present and to project future prospects. We have successfully implemented a method that can also be used for other regions of Georgia.
Keywords: erosivity potential, Georgia, GIS, Kakheti, rainfall
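The abstract does not name the three methods tested, so as a concrete illustration of the kind of rainfall-based proxy it describes, the following minimal Python sketch computes the Modified Fournier Index, one widely used erosivity estimate that needs only monthly rainfall totals; the station values are hypothetical.

```python
import numpy as np

def modified_fournier_index(monthly_rainfall_mm):
    """Modified Fournier Index: sum of squared monthly totals over the annual total.

    monthly_rainfall_mm: sequence of 12 monthly rainfall totals (mm).
    Higher values indicate greater erosivity potential.
    """
    p = np.asarray(monthly_rainfall_mm, dtype=float)
    return (p ** 2).sum() / p.sum()

# Hypothetical monthly totals (mm) for one station-year
monthly = [18, 22, 35, 60, 90, 75, 48, 40, 45, 55, 38, 25]
print(f"MFI = {modified_fournier_index(monthly):.1f} mm")
```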
Procedia PDF Downloads 229
23174 Pragmatic Development of Chinese Sentence Final Particles via Computer-Mediated Communication
Authors: Qiong Li
Abstract:
This study investigated under which conditions computer-mediated communication (CMC) can promote pragmatic development. The focal features were four Chinese sentence final particles (SFPs): a, ya, ba, and ne. They occur frequently in Chinese and function as mitigators to soften the tone of speech. However, L2 acquisition of SFPs is difficult, suggesting the necessity of additional exposure to, or explicit instruction on, Chinese SFPs. This study follows this line and explores two research questions: (1) Is CMC combined with data-driven instruction more effective than CMC alone in promoting L2 Chinese learners' SFP use? (2) How does L2 Chinese learners' SFP use change over time, as compared to the production of native Chinese speakers? The study involved 19 intermediate-level learners of Chinese enrolled at a private American university. They were randomly assigned to two groups: (1) the control group (N = 10), which was exposed to SFPs through CMC alone, and (2) the treatment group (N = 9), which was exposed to SFPs via CMC and data-driven instruction. Learners interacted with native speakers on given topics through text-based CMC over Skype. Both groups went through six 30-minute CMC sessions on a weekly basis, with a one-week interval after the first two CMC sessions and a two-week interval after the second two CMC sessions (nine weeks in total). The treatment group additionally received data-driven instruction after the first two sessions. Data analysis focused on three indices: token frequency, type frequency, and acceptability of SFP use. Token frequency was operationalized as the raw occurrence of SFPs per clause. Type frequency was the range of SFPs. Acceptability was rated by two native speakers using a rating rubric. The results showed that the treatment group made noticeable progress over time on all three indices, and its production of SFPs approximated the native-like level. In contrast, the control group improved only slightly on token frequency, and only certain SFPs (a and ya) reached native-like use. Potential explanations for the group differences are discussed in two respects: the properties of Chinese SFPs and the roles of CMC and data-driven instruction. Though CMC provided the learners with opportunities to notice and observe SFP use, SFPs, as features with low saliency, were not easily noticed in the input. The data-driven instruction in the treatment group directed the learners' attention to these particles, which facilitated development.
Keywords: computer-mediated communication, data-driven instruction, pragmatic development, second language Chinese, sentence final particles
Procedia PDF Downloads 421
23173 Forecasting Cancers Cases in Algeria Using Double Exponential Smoothing Method
Authors: Messis A., Adjebli A., Ayeche R., Talbi M., Tighilet K., Louardiane M.
Abstract:
Cancers are the second leading cause of death worldwide. The prevalence and incidence of cancers are increasing with aging and population growth. This study aims to predict and model the evolution of breast, colorectal, lung, bladder, and prostate cancers over the period 2014-2019. Data were analyzed using time series analysis with the double exponential smoothing method to forecast the future pattern. To describe and fit the appropriate models, Minitab statistical software version 17 was used. Between 2014 and 2019, the overall trend in the raw number of newly registered cancer cases was increasing over time. Our forecast model is validated by its good prediction for 2020; data were not available for 2021 and 2022. Time series analysis showed that double exponential smoothing is an efficient tool to model future data on the raw number of new cancer cases.
Keywords: cancer, time series, prediction, double exponential smoothing
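As a concrete illustration of the forecasting method, the following Python sketch hand-rolls Holt's double exponential smoothing; the case counts and smoothing weights are hypothetical stand-ins (Minitab selects the weights automatically).

```python
def double_exponential_smoothing(series, alpha, beta, horizon):
    """Holt's linear (double exponential) smoothing with an h-step-ahead forecast."""
    level, trend = series[0], series[1] - series[0]
    for value in series[1:]:
        last_level = level
        level = alpha * value + (1 - alpha) * (level + trend)   # smooth the level
        trend = beta * (level - last_level) + (1 - beta) * trend  # smooth the trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Hypothetical annual counts of newly registered cases, 2014-2019
cases = [1850, 1930, 2045, 2160, 2290, 2410]
print(double_exponential_smoothing(cases, alpha=0.8, beta=0.2, horizon=3))
```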
Procedia PDF Downloads 93
23172 Mutual Information Based Image Registration of Satellite Images Using PSO-GA Hybrid Algorithm
Authors: Dipti Patra, Guguloth Uma, Smita Pradhan
Abstract:
Registration is a fundamental task in image processing. It is used to transform different sets of data into one coordinate system, where the data are acquired at different times, from different viewing angles, and/or by different sensors. Registration geometrically aligns two images (the reference and target images). Registration techniques are applied to satellite images, where they are important for comparing or integrating the data obtained from different measurements. In this work, mutual information is used as the similarity metric for registration of satellite images. The transformation is assumed to be rigid. An attempt has been made here to optimize the transformation function. The proposed image registration technique, hybrid PSO-GA, combines Particle Swarm Optimization and the Genetic Algorithm and is used to find the optimal values of the transformation parameters. Performance comparisons in experiments on satellite images found that the proposed hybrid PSO-GA algorithm outperforms the other algorithms in terms of mutual information and registration accuracy.
Keywords: image registration, genetic algorithm, particle swarm optimization, hybrid PSO-GA algorithm, mutual information
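The fitness that each PSO/GA candidate transformation would be scored against is the mutual information between the reference image and the transformed target. A minimal NumPy sketch of that metric, estimated from a joint histogram, follows; the rigid transformation and the optimizer loop are omitted.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Mutual information between two equally sized images via their joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of image b
    nz = pxy > 0                              # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Toy example: a noisy copy of an image shares high MI with the original
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(128, 128)).astype(float)
target = ref + rng.normal(0, 5, size=ref.shape)
print(mutual_information(ref, target))
```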
Procedia PDF Downloads 413
23171 Social Movements of Central-Eastern Europe: Examining Trends of Cooperation and Antagonism by Using Big Data
Authors: Reka Zsuzsanna Mathe
Abstract:
Globalization and Europeanization have significantly contributed to a change in the role of nation-states. The global economic crisis, climate change, and the recent refugee crisis are just a few of the many challenges that cannot be effectively addressed through the traditional role of the nation-state. One of the main roles of states is to solve collective action problems; however, due to their changing roles, this is apparently becoming more and more difficult. Depending on political culture, collective action problems are solved either through cooperation or through conflict. The political culture of Central and Eastern European (CEE) countries is marked by low civic participation and a weak civil society. In this type of culture, collective action problems are likely to be resolved through conflict rather than through the democratic process of dialogue, and any social change is likely to be introduced by social movements. Several studies have been conducted on the social movements of the CEE countries; yet it is still not clear whether the most significant social movements of the region tend to choose cooperative or conflictual action strategies. This study differentiates between a national and a European action field, each with a different social order. The actors in the two fields are the broadly understood civil society members, conceptualized as social movements. This research tries to answer the following questions: a) What are the norms that best characterize the CEE countries' social order? b) What types of actors would prefer change, and in which areas? c) Is there a significant difference between the main actors active in the national versus the European field? The main hypotheses are that conflicting norms define the national and the European action fields and that there is a significant difference between the action strategies adopted by social movements acting in the two fields. In mapping the social order, the study uses data provided by the European Social Survey. Big data from the Global Data on Events, Location and Tone (GDELT) database offer information on the main social movements and their preferred types of action. The units of analysis are the so-called 'Visegrad 4' countries: Poland, the Czech Republic, Slovakia, and Hungary, and the research uses data from 2005 (after the European accession of these four countries) until May 2017. According to the data, the main hypotheses were confirmed.
Keywords: big data, Central and Eastern Europe, civil society, GDELT, social movements
Procedia PDF Downloads 164
23170 Effectiveness of Cold Calling on Students’ Behavior and Participation during Class Discussions: Punishment or Opportunity to Shine
Authors: Maimuna Akram, Khadija Zia, Sohaib Naseer
Abstract:
Pedagogical objectives and the nature of the course content may lead instructors to take varied approaches to selecting a student for the cold call, specifically in a studio setup where students work on different projects independently and show work in progress from time to time at scheduled critiques. Cold calling often proves to be an effective tool for eliciting a response without enforcing judgment on the recipients. While students who are cold-called exhibit a mixed range of behaviors, and their responses range from anxiety-ridden to inspired, there is a need for a greater understanding of how to utilize these exchanges to bring about fruitful and engaging outcomes in studio discussions. This study aims to unravel the dimensions of utilizing the cold-call approach in a didactic exchange within studio pedagogy. A questionnaire survey was conducted in an undergraduate class at an arts and design school. The impact of cold calling on students' participation was determined through various parameters, including course choice, participation frequency, students' comfort, and teaching methodology. After the surveys were analyzed, selected classroom teachers were interviewed to provide a qualitative faculty perspective. It was concluded that cold calling increases students' participation frequency and also increases preparation for class. Around 67% of students responded that teaching methods play an important role in learning activities and students' participation during class discussions, and 84% of participants agreed that cold calling is an effective way of learning. According to the research, cold calling can be used extensively without making students uncomfortable. As a result, the findings of this study support the use of this instructional method to encourage more students to participate in class discussions.
Keywords: active learning, class discussion, class participation, cold calling, pedagogical methods, student engagement
Procedia PDF Downloads 41
23169 Class Control Management Issues and Solutions in Interactive Learning Theories’ Efficiency and the Application Case Study: 3rd Year Primary School
Authors: Mohammed Belalia Douma
Abstract:
Interactive learning is considered the most effective learning strategy. It is an educational philosophy based on the learner's contribution and involvement, mainly in the classroom: how the learner interacts with the small society of the "classroom" and the level of his or her collaboration in challenges, discovery, games, and participation. Interactive learning aims to activate the learner's role in the learning process, focusing on research and experimentation and on the learner's self-reliance in obtaining information, acquiring skills, and forming values and attitudes. It is not based on memorization alone, but rather on developing thinking and the ability to solve problems, and on teamwork and collaborative learning. With the exchange of roles between teacher and student, when the student becomes more active and performs more operations than under traditional methods, we might face several class control management issues: noise, the stability of learning, etc. This research paper observes the application of interactive learning in a real classroom, tests several assumptions, and analyzes the issues arising from these strategies, mainly noise and class control. The research sample consisted of about 150 third-year primary school students in the "Chlef" district, Algeria; the pupils were beginners aged 08 to 10 years. We administered a confidential fifteen-question questionnaire and also analyzed the learners' attitudes over three months. As teachers, we have witnessed a variety of strategies for applying interactive learning, each with its own issues: time management, noise, uncontrolled classes, and overcrowded classes. Finally, the study concludes that although interactive learning is an undeniably effective method of teaching, it has drawbacks, and not all theoretical strategies can be applied; we conclude with solutions for this case study.
Keywords: interactive learning, student, learners, strategies
Procedia PDF Downloads 63
23168 Vascular Foramina of the Capitate Bone of the Hand – an Anatomical Study
Authors: Latha V. Prabhu, B.V. Murlimanju, P.J. Jiji, Mangala M. Pai
Abstract:
Background: The capitate is the largest among the carpal bones. There exists no literature about the vascular foramina of the capitate bone. The objective of the present study was to investigate the morphology and number of the nutrient foramina in cadaveric dried capitate bones of the Indian population. Methods: The present study included 59 capitate bones (25 right sided and 34 left sided), which were obtained from the gross anatomy laboratory of our institution. The bones were macroscopically observed for nutrient foramina, and data were collected with respect to their number, then tabulated and analyzed. Results: All of our specimens (100%) exhibited nutrient foramina over the non-articular and articular surfaces. The foramina were observed on the medial, lateral, palmar, and dorsal surfaces of the capitate bones and numbered from 6 to 23 in each bone. On the medial surface, the foramina ranged from 1 to 6; on the lateral surface, from 0 to 7; and on the palmar surface, between 0 and 5. However, most of the foramina were located on the dorsal surface, where they ranged from 3 to 11. Conclusion: We believe that the present study has provided additional data about the nutrient foramina of the capitate bones. The data are enlightening to the orthopedic surgeon and would help in hand surgeries. Knowledge of the foramina is also important for radiologists, to prevent misinterpretation of findings on X-ray and computed tomography images, since the foramina may mimic erosions and ossicles. Morphological knowledge of the vasculature, the foramina of entry, and their number is required to understand the concepts in avascular necrosis of the capitate.
Keywords: avascular necrosis, capitate, morphology, nutrient foramen
Procedia PDF Downloads 346
23167 Development and Validation of a Semi-Quantitative Food Frequency Questionnaire for Use in Urban and Rural Communities of Rwanda
Authors: Phenias Nsabimana, Jérôme W. Some, Hilda Vasanthakaalam, Stefaan De Henauw, Souheila Abbeddou
Abstract:
Tools for dietary assessment in adults are limited in low- and middle-income settings. The objective of this study was to develop and validate a semi-quantitative food frequency questionnaire (FFQ) against the multiple-pass 24-hour recall tool for use in urban and rural Rwanda. A total of 212 adults (154 females and 58 males), aged 18-49, including 105 urban and 107 rural residents from the four regions of Rwanda, were recruited into the present study. The multiple-pass 24-hour recall technique was used to collect dietary data in both urban and rural areas in four rounds, on different days (one weekday and one weekend day), separated by a period of three months, from November 2020 to October 2021. The details of all foods and beverages consumed over the 24-hour period of the day prior to the interview were collected during face-to-face interviews. A list of foods, beverages, and commonly consumed recipes was developed by the study researchers and ten research assistants from the different regions of Rwanda. Non-standard recipes were collected when the information was available. A single semi-quantitative FFQ was also developed by the same group prior to the beginning of data collection and was administered at the beginning and the end of the data collection period. Data were collected digitally. The amounts of energy and macronutrients contributed by each food, recipe, and beverage will be computed based on the nutrient composition reported in food composition tables and the weight consumed. Median energy and nutrient contents of food intakes from the FFQ and 24-hour recalls, and median differences (24-hour recall minus FFQ), will be calculated. Kappa, Spearman, Wilcoxon, and Bland-Altman statistics will be computed to evaluate the agreement between the estimated nutrient and energy intakes found by the two methods. Differences will be tested for significance, and all analyses will be done with STATA 11. Data collection was completed in November 2021; data cleaning is ongoing, and the data analysis is expected to be completed by July 2022. A developed and validated semi-quantitative FFQ will then be available for use in dietary assessment. The developed FFQ will help researchers collect reliable data that will support policy makers in planning proper dietary change interventions in Rwanda.
Keywords: food frequency questionnaire, reproducibility, 24-h recall questionnaire, validation
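Of the planned agreement statistics, the Bland-Altman analysis is the least standard to compute by hand; the sketch below shows its core quantities (mean bias and 95% limits of agreement) in Python, with hypothetical energy intakes, since the study's own analysis will be run in STATA 11.

```python
import numpy as np

def bland_altman_stats(ffq, recall):
    """Mean bias and 95% limits of agreement between two dietary-intake estimates."""
    ffq, recall = np.asarray(ffq, float), np.asarray(recall, float)
    diff = recall - ffq                 # 24-hour recall minus FFQ, per participant
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical energy intakes (kcal/day) for five participants
ffq_kcal = [2100, 1850, 2400, 1980, 2250]
recall_kcal = [2010, 1900, 2280, 2050, 2150]
print(bland_altman_stats(ffq_kcal, recall_kcal))
```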
Procedia PDF Downloads 145
23166 A Study of Variables Affecting on a Quality Assessment of Mathematics Subject in Thailand by Using Value Added Analysis on TIMSS 2011
Authors: Ruangdech Sirikit
Abstract:
The purpose of this research was to study the variables affecting the quality assessment of mathematics education in Thailand by using value-added analysis of TIMSS 2011 data. The data used in this research are secondary data from the 2011 Trends in International Mathematics and Science Study (TIMSS), collected from 6,124 students in 172 schools in Thailand, covering the mathematics subject only. The data were based on 14 assessment tests of knowledge in mathematics. There were three steps of data analysis: 1) analyzing descriptive statistics; 2) estimating student competency from the assessment of mathematics proficiency using the MULTILOG program; and 3) analyzing value added in the quality assessment model using a value-added model with hierarchical linear modeling (HLM) and two levels of analysis. The research results were as follows: 1. Student-level variables that had significant effects on student competency at the .01 level were parental care, resources at home, enjoyment of learning mathematics, and extrinsic motivation in learning mathematics; variables significant at the .05 level were parents' education and self-confidence in learning mathematics. 2. The school-level variable that had a significant effect on student competency at the .01 level was extra-large school size; the variable significant at the .05 level was medium school size.
Keywords: quality assessment, value-added model, TIMSS, mathematics, Thailand
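A two-level value-added model of this kind can be approximated with a random school intercept; the Python sketch below uses statsmodels' MixedLM on toy data (the variable names are illustrative, not the TIMSS field names).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy long-format data: one row per student, 'school' is the level-2 unit
df = pd.DataFrame({
    "competency":    [0.6, -0.2, 1.1, 0.3, -0.5, 0.9, 0.1, 0.7],
    "parental_care": [3, 2, 4, 3, 1, 4, 2, 3],
    "enjoy_math":    [4, 2, 5, 3, 2, 4, 3, 5],
    "school":        ["A", "A", "A", "A", "B", "B", "B", "B"],
})

# A random intercept per school gives the two-level structure of the value-added model
model = smf.mixedlm("competency ~ parental_care + enjoy_math", df, groups=df["school"])
print(model.fit().summary())
```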
Procedia PDF Downloads 286
23165 Modeling Average Paths Traveled by Ferry Vessels Using AIS Data
Authors: Devin Simmons
Abstract:
At the USDOT’s Bureau of Transportation Statistics, a biannual census of ferry operators in the U.S. is conducted, with results such as route mileage used to determine federal funding levels for operators. AIS data allows for the possibility of using GIS software and geographical methods to confirm operator-reported mileage for individual ferry routes. As part of the USDOT’s work on the ferry census, an algorithm was developed that uses AIS data for ferry vessels in conjunction with known ferry terminal locations to model the average route travelled, for use both as a cartographic product and as confirmation of operator-reported mileage. AIS data from each vessel is first analyzed to determine individual journeys based on the vessel’s velocity and changes in velocity over time. These trips are then converted to geographic linestring objects. Using the terminal locations, the algorithm then determines whether the trip represented a known ferry route. Given a large enough dataset, routes will be represented by multiple trip linestrings, which are then filtered by DBSCAN spatial clustering to remove outliers. Finally, these remaining trips are ready to be averaged into one route. The algorithm interpolates the point on each trip linestring that represents the start point. From these start points, a centroid is calculated, and the first point of the average route is determined. Each trip is interpolated again to find the point that represents one percent of the journey’s completion, and the centroid of those points is used as the next point in the average route, and so on until 100 points have been calculated. Routes created using this algorithm have shown demonstrable improvement over previous methods, which included the implementation of a LOESS model. Additionally, the algorithm greatly reduces the amount of manual digitizing needed to visualize ferry activity.
Keywords: ferry vessels, transportation, modeling, AIS data
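A minimal sketch of the percent-completion averaging step, assuming shapely and trips already oriented in the same direction; journey extraction from AIS velocity and the DBSCAN outlier filter are omitted.

```python
import numpy as np
from shapely.geometry import LineString, MultiPoint

def average_route(trips, n_points=100):
    """Average several trip linestrings into one representative route.

    trips: list of shapely LineStrings, all oriented terminal A -> terminal B.
    At each fraction of journey completion, take the centroid of the
    interpolated points across trips.
    """
    route = []
    for frac in np.linspace(0.0, 1.0, n_points + 1):
        pts = [t.interpolate(frac, normalized=True) for t in trips]
        route.append(MultiPoint(pts).centroid)
    return LineString(route)

trips = [LineString([(0, 0), (1.0, 0.1), (2, 0)]),
         LineString([(0, 0), (1.0, -0.1), (2, 0)])]
print(average_route(trips).length)
```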
Procedia PDF Downloads 180
23164 Power Transformer Risk-Based Maintenance by Optimization of Transformer Condition and Transformer Importance
Authors: Kitti Leangkrua
Abstract:
This paper presents a risk-based maintenance strategy for power transformers in order to optimize operating and maintenance costs. The methodology involves the preparation of a database for collecting the technical data and test data of each power transformer. The overall condition of each transformer is evaluated by a program developed from the measured results; in addition, an overall health index (%HI) is calculated for each transformer from the condition of its main components, together with criteria for evaluating the importance (%ImI) of each location where a transformer is installed. The condition assessment is performed by analyzing test data such as electrical tests, insulating oil tests, and visual inspection, and the condition of each power transformer is classified from very poor to very good. Importance is evaluated from load criticality, importance of the load, and failure consequence. A risk matrix is developed for evaluating the risk of each power transformer, and the high-risk power transformers are addressed first. A computerized program has been developed for practical use, so that the maintenance strategy for a power transformer can be effectively managed.
Keywords: asset management, risk-based maintenance, power transformer, health index
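A sketch of the two core calculations follows, with illustrative component weights and thresholds; the paper does not publish its scoring scheme, so every number here is a placeholder.

```python
def health_index(scores, weights):
    """Weighted condition score in percent; each score runs 0 (bad) to 4 (good)."""
    total = sum(w * s for w, s in zip(weights, scores))
    return 100.0 * total / (4 * sum(weights))

def risk_class(hi_pct, imi_pct):
    """Map condition (%HI) and importance (%ImI) onto a simple 2x2 risk matrix."""
    poor, important = hi_pct < 50, imi_pct >= 50
    if poor and important:
        return "high risk - maintain first"
    if poor or important:
        return "medium risk"
    return "low risk"

# Hypothetical sub-scores: oil quality, dissolved-gas analysis, visual inspection
scores, weights = [2, 3, 4], [3, 4, 1]
hi = health_index(scores, weights)
print(hi, risk_class(hi, imi_pct=70))
```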
Procedia PDF Downloads 311
23163 Transition Pay vs. Liquidity Holdings: A Comparative Analysis on Consumption Smoothing using Bank Transaction Data
Authors: Nora Neuteboom
Abstract:
This study investigates household financial behaviors during unemployment spells in the Netherlands using high-frequency transaction data in an event-study specification that integrates propensity score matching. In our specification, we contrast treated individuals, who underwent job loss, with non-treated individuals possessing comparable financial characteristics. The initial onset of unemployment triggers a substantial surge in income, primarily attributable to transition payments, but income drops swiftly post-unemployment, with unemployment benefits covering slightly over half of former salary earnings. Despite a re-employment rate of around one half within six months, the treatment group experiences a persistent average monthly earnings reduction of approximately 600 EUR. Spending patterns fluctuate significantly, surging before unemployment due to transition payments and declining below those of non-treated individuals post-unemployment, indicating challenges in fully smoothing consumption after job loss. Furthermore, our study disentangles the effects of transition payments and liquidity holdings on spending, revealing that transition payments exert a more pronounced and prolonged impact on consumption smoothing than liquidity holdings. Transition payments significantly stimulate spending, particularly in the pin and iDEAL categories, in contrast to the much smaller relative spending impact of liquidity holdings.
Keywords: household consumption, transaction data, big data, propensity score matching
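The matching step can be sketched as follows: fit a propensity model on pre-period financial covariates, then pair each treated individual with the nearest-propensity control. The data and covariates below are synthetic placeholders, not the study's bank records.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
# Hypothetical pre-period covariates: income, spending, liquidity holdings
X = rng.normal(size=(500, 3))
treated = rng.integers(0, 2, size=500).astype(bool)   # job-loss indicator

# 1. Propensity scores from a logistic model of treatment on covariates
pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each treated individual to the nearest-propensity control
nn = NearestNeighbors(n_neighbors=1).fit(pscore[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(pscore[treated].reshape(-1, 1))
controls = np.flatnonzero(~treated)[idx.ravel()]
print("matched control indices:", controls[:5])
```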
Procedia PDF Downloads 29
23162 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data
Authors: Michelangelo Sofo, Giuseppe Labianca
Abstract:
In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the research field of nutritional biology, the curative process, based on the analysis of clinical data, is a very delicate operation, since there are multiple solutions for the management of pathologies in the food sector (examples include intolerances and allergies, management of cholesterol metabolism, diabetic pathologies, and arterial hypertension, up to obesity and breathing and sleep problems). In this regard, this research work created a system capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it is therefore in step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists of drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data and a symbolic solution, obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow the complete picture of the situation and the evolution of the diet assigned for specific pathologies to be evaluated.
Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm
Procedia PDF Downloads 31
23161 Hybrid Collaborative-Context Based Recommendations for Civil Affairs Operations
Authors: Patrick Cummings, Laura Cassani, Deirdre Kelliher
Abstract:
In this paper we present findings from a research effort to apply a hybrid collaborative-context approach to a system focused on Marine Corps civil affairs data collection, aggregation, and analysis called the Marine Civil Information Management System (MARCIMS). The goal of this effort is to provide operators with information to make sense of the interconnectedness of entities and relationships in their area of operation and to discover existing data to support civil military operations. Our approach to building a recommendation engine was designed to overcome several technical challenges, including: 1) ensuring models were robust to the relatively small amount of data collected by the Marine Corps civil affairs community; 2) finding methods to recommend novel data for which no interactions are captured; and 3) overcoming confirmation bias by ensuring content was recommended that was relevant to the mission despite being obscure or less well known. We solve these challenges by implementing a combination of collective matrix factorization (CMF) and graph-based random walks to provide recommendations to civil military operations users. We also present a method to resolve the challenge of the computational complexity inherent in highly connected nodes through a precomputed process.
Keywords: recommendation engine, collaborative filtering, context based recommendation, graph analysis, coverage, civil affairs operations, Marine Corps
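A minimal sketch of the graph-based random walk component, written as a random walk with restart over an adjacency matrix; the scores it assigns to nodes the user never interacted with are what make novel recommendations possible. CMF and the precomputation for highly connected nodes are omitted, and the graph is a toy stand-in.

```python
import numpy as np

def random_walk_with_restart(adj, seed, restart=0.15, iters=50):
    """Steady-state visiting probabilities of a walk restarting at a seed node.

    adj: (n, n) adjacency matrix of the entity/relationship graph.
    Returns a score per node; high-scoring unvisited nodes are candidate
    recommendations.
    """
    col_sums = adj.sum(axis=0, keepdims=True)
    P = adj / np.where(col_sums == 0, 1, col_sums)   # column-stochastic transitions
    r = np.zeros(adj.shape[0])
    r[seed] = 1.0
    p = r.copy()
    for _ in range(iters):
        p = (1 - restart) * (P @ p) + restart * r
    return p

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 0],
                [0, 1, 0, 0]], dtype=float)
print(random_walk_with_restart(adj, seed=0))
```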
Procedia PDF Downloads 128
23160 An Exploratory Study on the Impact of Video-stimulated Reflection on Novice EFL Teachers’ Professional Development
Authors: Ibrahima Diallo
Abstract:
The literature on teacher education foregrounds reflection as an important aspect of professional practice. Reflection for a teacher consists of critically analysing and evaluating a lesson retrospectively to see what worked, what did not work, and how to improve it in the future. Many teacher education programmes worldwide now consider the ability to reflect one of the hallmarks of an effective educator. However, in some contexts, like Senegal, reflection has not been given due consideration in teacher education programmes. In contexts where it has been part of the education landscape for some time, reflection is mostly depicted as an individual written activity, and many teacher trainees have become disenchanted by repeated enactments of a task solely intended to satisfy course requirements. This has resulted in whitewashing weaknesses or even 'faking' reflection. Besides, the 'one-size-fits-all' approach to reflection could not flourish because how reflection impacts practice is still unproven. Therefore, reflective practice needs to be contextualised and made more thought-provoking through dialogue and by using classroom data. There is also a need to highlight change brought about in teachers' practice through reflection. So, this study introduces reflection in a new context and aims to show evidenced change in novice EFL teachers' practice through dialogic, data-led reflection. The purpose of this study is also to contribute to the scarce literature on reflection in sub-Saharan Africa by bringing new perspectives on contextualised, teacher-led reflection. Eight novice EFL teachers participated in this qualitative longitudinal study, and data were gathered online through post-lesson reflection recordings and lesson videos over a period of four months. The data were then thematically analysed using NVivo to systematically organize and manage the large amount of data, following the six-step approach to thematic analysis. Major themes related to the teachers' classroom practice and their conception of reflection emerged from the analysis. The results showed that post-lesson reflection with a peer can help novice EFL teachers gain more awareness of their classroom practice. Dialogic reflection also helped them evaluate their lessons and seek improvement. The analysis of the data also gave insight into the teachers' conception of reflection in an EFL context. It was found that teachers were more engaged in reflection when using their lesson video recordings, and change in teaching behaviour as a result of reflection was evidenced by the analysis of the lesson video recordings. This study has shown that video-stimulated reflection is a practical form of professional development that can be embedded in teachers' professional lives.
Keywords: novice EFL teachers, practice, professional development, video-stimulated reflection
Procedia PDF Downloads 101
23159 Official Game Account Analysis: Factors Influence Users' Judgments in Limited-Word Posts
Authors: Shanhua Hu
Abstract:
Social media, as a critical form of publicity for films, video games, and digital products, has received substantial research attention, but several critical barriers remain: (1) few studies explore the internal and external connections of a product as part of the multimodal context that gives rise to readability and commercial return; (2) multimodal analysis of game publishers' official product accounts and its impact on user behaviors, including purchase intention, social media engagement, and playing time, is lacking; and (3) no standardized, ecologically valid data varying by game type can be used to study the complexity of an official account's postings within a time period. The proposed research helps to tackle these limitations in order to develop a model of readability study that is more ecologically valid, robust, and thorough. To accomplish this objective, this paper provides a more diverse dataset comprising different visual elements and messages collected from the official Twitter accounts of the top 20 best-selling games of 2021. Video game companies target potential users through social media; a popular approach is to set up an official account to maintain exposure. Typically, major game publishers create an official account on Twitter months before the game's release date to post updates on the game's development, announce collaborations, and reveal spoilers. Analyses of tweets from those official Twitter accounts would assist publishers and marketers in identifying how to deploy advertising efficiently and precisely to increase game sales. The purpose of this research is to determine how official game accounts use Twitter to attract new customers, specifically which types of messages are most effective at increasing sales. The dataset includes the number of days between each Twitter post and the actual release date, the readability of the post (Flesch Reading Ease Score, FRES), the number of emojis used, the number of hashtags, the number of followers of the mentioned users, the categorization of the posts (i.e., spoilers, collaborations, promotions), and the number of video views. The timeline of Twitter postings from official accounts will be compared with the history of pre-orders and sales figures to determine the potential impact of social media posts. This study aims to determine how the above-mentioned characteristics of official accounts' Twitter postings influence game sales and to examine the possible causes of this influence. The outcome will provide researchers with a list of potential aspects that could influence people's judgments in limited-word posts. With increased average time spent online, users adapt more quickly than before to online information exchange and reading, including word choice, sentence length, and the use of emojis or hashtags. The study of official game account promotion will not only enable publishers to create more effective promotion techniques in the future but also provide ideas for future research on how social media posts with a limited number of words influence consumers' purchasing decisions. Future research can focus on more specific linguistic aspects, such as precise word choice in advertising.
Keywords: engagement, official account, promotion, twitter, video game
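The readability measure is the standard Flesch Reading Ease formula; a self-contained Python sketch with a crude syllable counter follows (production code would use a proper syllable dictionary).

```python
import re

def count_syllables(word):
    """Crude vowel-group syllable count; adequate for a readability sketch."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """FRES = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

tweet = "Pre-orders are live! Meet our new hero. Launch trailer drops Friday."
print(round(flesch_reading_ease(tweet), 1))
```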
Procedia PDF Downloads 85
23158 Ontology-Based Approach for Temporal Semantic Modeling of Social Networks
Authors: Souâad Boudebza, Omar Nouali, Faiçal Azouaou
Abstract:
Social networks have recently attracted growing interest on the web. Traditional formalisms for representing social networks are static and suffer from a lack of semantics. In this paper, we show how semantic web technologies can be used to model social data. The SemTemp ontology aligns and extends existing ontologies such as FOAF, SIOC, SKOS, and OWL-Time to provide a temporal and semantically rich description of social data. We also present a modeling scenario to illustrate how our ontology can be used to model social networks.
Keywords: ontology, semantic web, social network, temporal modeling
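As a minimal rdflib sketch of the pattern the abstract describes, the snippet below reifies a social tie so it can carry an OWL-Time validity interval; the example namespace and the validDuring property are illustrative stand-ins, not the actual SemTemp vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, RDF, XSD

TIME = Namespace("http://www.w3.org/2006/time#")
EX = Namespace("http://example.org/semtemp#")   # stand-in, not the real SemTemp namespace

g = Graph()
g.add((EX.alice, RDF.type, FOAF.Person))
g.add((EX.bob, RDF.type, FOAF.Person))
tie, interval, begin = EX.aliceKnowsBob, EX.tieInterval, EX.tieStart
g.add((tie, EX.validDuring, interval))          # hypothetical linking property
g.add((interval, RDF.type, TIME.Interval))
g.add((interval, TIME.hasBeginning, begin))
g.add((begin, RDF.type, TIME.Instant))
g.add((begin, TIME.inXSDDateTime, Literal("2015-03-01T00:00:00", datatype=XSD.dateTime)))
print(g.serialize(format="turtle"))
```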
Procedia PDF Downloads 393
23157 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases
Authors: Hao-Hsiang Ku, Ching-Ho Chi
Abstract:
Big data applications have become imperative for many fields, and many researchers have been devoted to increasing classification accuracy and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases in order to build mass behavior models. Mass behavior models are built with MapReduce techniques and the Hadoop distributed file system on the Hadoop service platform. The inference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.
Keywords: Hadoop, NoSQL, ontology, backpropagation neural network, Hadoop distributed file system
Procedia PDF Downloads 264
23156 Biophysically Motivated Phylogenies
Authors: Catherine Felce, Lior Pachter
Abstract:
Current methods for building phylogenetic trees from gene expression data consider mean expression levels. With single-cell technologies, we can leverage more information about cell dynamics by considering the entire distribution of gene expression across cells. Using biophysical modeling, we propose a method for constructing phylogenetic trees from scRNA-seq data, building on Felsenstein's method of continuous characters. This method can highlight genes whose level of expression may be unchanged between species, but whose rates of transcription/decay may have evolved over time.
Keywords: phylogenetics, single-cell, biophysical modeling, transcription
Procedia PDF Downloads 64
23155 Open Educational Resource in Online Mathematics Learning
Authors: Haohao Wang
Abstract:
Technology, in particular the multimedia found in Open Educational Resources, can contribute positively to student performance in an online instructional environment. Student performance data from the past four years were obtained from an online course entitled Applied Calculus (MA139). This paper examined the data to determine whether multimedia (the independent variable) had any impact on student performance (the dependent variable) in online math learning, and how students felt about the value of the technology. Two groups of student data were analyzed: group 1 (control), from the online applied calculus course that did not use multimedia instructional materials, and group 2 (treatment), from the same online applied calculus course that did. For the MA139 class, results indicate a statistically significant difference (p = .001) between the two groups, where group 1 had a final score mean of 56.36 (out of 100) and group 2 one of 70.68. Additionally, student testimonials are discussed, in which students shared their experiences of learning applied calculus online with multimedia instructional materials.
Keywords: online learning, open educational resources, multimedia, technology
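The reported comparison is a two-group test of final-score means; a minimal scipy sketch of such a test is shown below, with hypothetical scores chosen only to echo the reported group means.

```python
from scipy import stats

# Hypothetical final scores (out of 100) for the two groups
group1 = [52, 61, 48, 59, 55, 63, 57, 54]   # control: no multimedia
group2 = [68, 74, 65, 71, 77, 69, 72, 70]   # treatment: multimedia OER

t_stat, p_value = stats.ttest_ind(group1, group2)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```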
Procedia PDF Downloads 380
23154 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel
Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani
Abstract:
Agent-based systems technology has been put forward as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments to solve problems. Nevertheless, it is impractical to rely on a single agent for all the computing processes involved in solving complex problems, and an increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems that are beyond the individual capacities or knowledge of each problem solver. However, a MAS network still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS should be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents, and generate a bathymetric map from the data received from each ASV unit. The first algorithm is developed to communicate with the ASVs via radio communication using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, and formation and pattern generation are tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of the data collected by the agents to create a bathymetric map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry
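The first algorithm exchanges standard NMEA 0183 sentences; a minimal Python sketch of the receiving side, validating the XOR checksum and pulling position fields out of a $GPGGA report, is shown below (the sample sentence is the textbook example, not project data).

```python
def nmea_checksum_ok(sentence):
    """Validate the XOR checksum of an NMEA 0183 sentence."""
    body, _, given = sentence.strip().lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)                      # XOR of all chars between $ and *
    return f"{calc:02X}" == given.upper()

def parse_gga(sentence):
    """Extract UTC time and raw lat/lon fields from a $GPGGA position report."""
    fields = sentence.split(",")
    return {"utc": fields[1], "lat": fields[2] + fields[3], "lon": fields[4] + fields[5]}

msg = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(nmea_checksum_ok(msg), parse_gga(msg))
```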
Procedia PDF Downloads 275
23153 The Role of Big Data Analytics and Corporate Social Responsibility in Driving Green Innovation
Authors: Abdeslam Hassani
Abstract:
This study addresses the increasing environmental concerns faced by businesses due to regulatory and stakeholder pressures. It explores how big data analytics (BDA) and advanced technologies, particularly artificial intelligence, combined with corporate social responsibility (CSR), can foster green innovation and sustainable practices. The research builds on existing literature, highlighting the critical role of technologies and CSR in achieving sustainability goals. This research adopts a multidimensional approach, offering a more comprehensive understanding of the interplay between technologies, governance, and environmental policies. A qualitative methodology was chosen, involving a systematic literature review and semi-structured interviews with executives from Canadian companies. NVivo software will be used to analyze interview data, ensuring a rigorous approach to identifying key contextual factors. The cross-analysis of literature findings and interview insights will help validate theoretical constructs and develop a conceptual framework. This study contributes by providing both theoretical insights and practical recommendations. It offers executives actionable guidance on integrating CSR into strategic decision-making and aligning technological capabilities with sustainability objectives. This approach aims to improve firms’ competitiveness, ensure regulatory compliance, and enhance their role in promoting green innovation.
Keywords: big data analytics, corporate social responsibility, green innovation, advanced technology
Procedia PDF Downloads 7
23152 Conceptualizing Clashing Values in the Field of Media Ethics
Authors: Saadia Izzeldin Malik
Abstract:
Lack of ethics is the crisis of the 21st century. Today's global world is filled with economic, political, environmental, media/communication, and social crises, all generated by the eroding fabric of ethics and the moral values that guide human decisions in all aspects of life. Our global world is guided by liberal Western democratic principles and liberal capitalist economic principles that define and reinforce each other. In economic terms, capitalism has turned world economic systems into one marketplace of ideas and products controlled by big multinational corporations that not only determine the conditions and terms of commodity production and commodity exchange between countries but also transform the political economy of media systems around the globe. The citizen (read: the consumer) is today the target of persuasion by all types of media, at a time when her/his interests should, ethically and in principle, be the basic significant factor in the selection of media content. At this juncture of clashing media values, professional and commercial, and widespread ethical lapses by media organizations and media professionals, it is very important to think of a perspective that theorizes these conflicting values within a broader framework of media ethics. Thus, the aim of this paper is to bring to the center, epistemologically, a perspective on media ethics as a basis for the reconciliation of the clashing values of the media. The paper focuses on conflicting ethical values in the current media debate, namely ownership of media vs. press freedom, the individual's right to privacy vs. the public's right to know, and global Western consumerist values vs. media values. The paper concludes that a framework to reconcile the conflicting values of media ethics should focus on the 'individual' journalist and his/her moral development, as well as on maintaining the ethical principles of the media as an institution with a primary social responsibility to the 'public' it serves.
Keywords: ethics, media, journalism, social responsibility, conflicting values, global
Procedia PDF Downloads 499
23151 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle
Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar
Abstract:
As networks become larger and more complex, the density of user devices is also increasing. The development of Unmanned Aerial Vehicle (UAV) networks makes it possible to collect and transform data efficiently by using software-defined networking (SDN) technology. This paper proposes a three-layer, distributed, and dynamic cluster architecture for managing UAVs that uses an AI-based resource allocation calculation algorithm to address the network overload problem. By separating the services of each UAV, the hierarchical UAV cluster system performs the main function of reducing network load and transferring user requests, with three sub-tasks: data collection, communication channel organization, and data relaying. In each cluster, a head node and a vice head node UAV are selected by considering the devices' Central Processing Unit (CPU), operational (RAM) and permanent (ROM) memory, battery charge, and capacity. The vice head node acts as a backup that stores all the data held by the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to the affected area is proposed as the traffic offloading algorithm.
Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles
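A minimal sketch of two steps of the proposal follows: detecting high-load regions with k-means and ranking candidate head nodes by a weighted resource score. The device positions, UAV resource values, and weights are synthetic placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
device_xy = rng.uniform(0, 100, size=(300, 2))       # user-device positions

# 1. Detect high-load regions by clustering device locations
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(device_xy)
loads = np.bincount(km.labels_)
hot = int(np.argmax(loads))                          # busiest region
print("hot region centre:", km.cluster_centers_[hot], "devices:", loads[hot])

# 2. Rank candidate UAVs for head node by a weighted resource score
# columns: CPU, RAM, ROM, battery, capacity (normalized 0-1); weights illustrative
uavs = rng.uniform(0, 1, size=(6, 5))
weights = np.array([0.25, 0.2, 0.1, 0.3, 0.15])
score = uavs @ weights
head, vice = np.argsort(score)[::-1][:2]             # best and runner-up
print(f"head UAV: {head}, vice head UAV: {vice}")
```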
Procedia PDF Downloads 120
23150 Deep Learning with Noisy Labels: Learning True Labels as Discrete Latent Variable
Authors: Azeddine El-Hassouny, Chandrashekhar Meshram, Geraldin Nanfack
Abstract:
In recent years, learning from data with noisy labels (label noise) has been a major concern in supervised learning. This problem has become even more worrying in deep learning, whose generalization capabilities have lately been questioned. Indeed, deep learning requires large amounts of data that are generally collected by search engines, which frequently return data with unreliable labels. In this paper, we investigate label noise in deep learning using variational inference. Our contributions are: (1) exploiting the label noise concept, where the true labels are learnt using reparameterization variational inference, while observed labels are learnt discriminatively; (2) learning the noise transition matrix during training without any particular process, heuristic, or preliminary phase. The theoretical results show how the true label distribution can be learned by variational inference in any discriminative neural network, and the effectiveness of our approach is demonstrated on several target datasets, such as MNIST and CIFAR32.
Keywords: label noise, deep learning, discrete latent variable, variational inference, MNIST, CIFAR32
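For context, the simplest way a noise transition matrix is typically used is forward loss correction, sketched below in NumPy; the paper's variational treatment of the true label as a discrete latent variable goes beyond this baseline, and the matrix here is an illustrative placeholder.

```python
import numpy as np

def forward_corrected_nll(logits, noisy_labels, T):
    """Cross-entropy on noisy labels after applying the noise transition matrix.

    T[i, j] = P(observed label j | true label i). The network's softmax
    estimates the clean posterior; multiplying by T yields the noisy posterior
    that the observed labels are scored against.
    """
    z = logits - logits.max(axis=1, keepdims=True)
    clean = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)   # softmax
    noisy = clean @ T                                          # P(noisy label | x)
    n = len(noisy_labels)
    return -np.log(noisy[np.arange(n), noisy_labels] + 1e-12).mean()

T = np.array([[0.9, 0.1],      # 10% of class-0 examples mislabeled as class 1
              [0.2, 0.8]])
logits = np.array([[2.0, -1.0], [0.5, 1.5]])
print(forward_corrected_nll(logits, np.array([0, 1]), T))
```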
Procedia PDF Downloads 132
23149 Multimodal Database of Retina Images for Africa: The First Open Access Digital Repository for Retina Images in Sub Saharan Africa
Authors: Simon Arunga, Teddy Kwaga, Rita Kageni, Michael Gichangi, Nyawira Mwangi, Fred Kagwa, Rogers Mwavu, Amos Baryashaba, Luis F. Nakayama, Katharine Morley, Michael Morley, Leo A. Celi, Jessica Haberer, Celestino Obua
Abstract:
Purpose: The main aim of creating the Multimodal Database of Retinal Images for Africa (MoDRIA) was to provide a publicly available repository of retinal images for responsible researchers to conduct algorithm development, in a bid to curb the challenges of ophthalmic artificial intelligence (AI) in Africa. Methods: Data and retinal images were ethically sourced from sites in Uganda and Kenya. Data on medical history, visual acuity, ocular examination, blood pressure, and blood sugar were collected. Retinal images were captured using fundus cameras (Foru3-nethra and Canon CR-Mark-1) and stored in a secure online database. Results: The database consists of 7,859 retinal images in portable network graphics format from 1,988 participants. Images from patients with human immunodeficiency virus made up 18.9%; 18.2% of images were from hypertensive patients, 12.8% from diabetic patients, and the rest from 'normal' participants. Conclusion: Publicly available data repositories are a valuable asset in the development of AI technology. There is therefore a need to expand MoDRIA so as to provide larger datasets that are more representative of Sub-Saharan data.
Keywords: retina images, MoDRIA, image repository, African database
Procedia PDF Downloads 136
23148 Assessment of Environmental Quality of an Urban Setting
Authors: Namrata Khatri
Abstract:
The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity of assessing the environmental impact of urbanisation. Satellite data were utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a drop in vegetation cover and an increase in built-up areas. High-resolution satellite data were also utilised to map the city's open areas and gardens. GIS-based analysis was used to assess public green space accessibility and to identify regions with inadequate waste management practices; the findings revealed that garbage collection and disposal techniques in specific areas of the city need to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) effect. The study's conclusions have far-reaching ramifications for urban planners and politicians in Belagavi and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effects of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite pictures were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground-truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data. Overall, this study provides a complete assessment of Belagavi's environmental quality and emphasizes the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management.
Keywords: environmental quality, UEQ, remote sensing, GIS
Procedia PDF Downloads 84
23147 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton
Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani
Abstract:
Extreme values in a set of observations can occur due to unusual circumstances during the observation. Such data can provide important information that other data cannot, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution; here it is the Gumbel distribution with two parameters. The exact values of the maximum likelihood (ML) parameter estimates for the Gumbel distribution are difficult to determine, so an approximate approach is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear function optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has the form of a double exponential function. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the parameter value changes in each iteration; it is then modified with the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is further modified by updating an approximation of the second derivative in each iteration. Parameter estimation of the Gumbel distribution by this numerical approach is done by calculating the parameter values that maximize the likelihood of the distribution; this requires the gradient vector and the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and parameter estimates for the Gumbel distribution. The estimation method was then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the heavy rainfall that occurred in Purworejo District has decreased, and the range of rainfall has also decreased.
Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton
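A minimal sketch of the estimation itself: minimize the negative Gumbel log-likelihood with scipy's BFGS implementation, parameterizing the scale on the log axis to keep it positive. The annual-maximum rainfall values are hypothetical, not the Purworejo data.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, x):
    """Negative log-likelihood of the Gumbel (maxima) distribution."""
    mu, log_beta = params              # optimize log(beta) so beta stays positive
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    return len(x) * np.log(beta) + np.sum(z + np.exp(-z))

# Hypothetical annual-maximum daily rainfall (mm), one block maximum per year
x = np.array([62.0, 55.3, 71.8, 48.9, 80.2, 66.4, 59.1, 74.5, 52.7, 69.0])

start = np.array([x.mean(), np.log(x.std())])
res = minimize(neg_log_likelihood, start, args=(x,), method="BFGS")
mu_hat, beta_hat = res.x[0], np.exp(res.x[1])
print(f"mu = {mu_hat:.2f}, beta = {beta_hat:.2f}, converged: {res.success}")
```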
Procedia PDF Downloads 331
23146 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis
Authors: Anisul Islam
Abstract:
The U.S. has a very strong trade relationship with China, but with a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for the persistent deficit; the empirical results are mixed at best. This paper empirically estimates the U.S. export function along with the U.S. import function for its trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as for the possible existence of the J-curve hypothesis. Annual export and import data are utilized for as long a period as the time series data exist. The export and import functions are estimated using advanced econometric techniques, with appropriate diagnostic tests performed to examine the validity and reliability of the estimated results. The annual time-series data cover 1975 to 2022, a sample size of 48 years, the longest period utilized in any study to date. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF Financial Statistics, the IMF Direction of Trade Statistics, and several others. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful in reducing such deficits over time. As such, the paper will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.
Keywords: exports, imports, Marshall-Lerner condition, J-curve hypothesis, United States, China
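The Marshall-Lerner test ultimately reduces to exchange-rate elasticities from log-linear export and import equations; the sketch below shows the import side with statsmodels on synthetic random-walk series (a real application would first address nonstationarity and cointegration, as the paper's diagnostics imply).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 48  # annual observations, mirroring 1975-2022
df = pd.DataFrame({
    "ln_imports": rng.normal(size=n).cumsum(),
    "ln_income":  rng.normal(size=n).cumsum(),
    "ln_rer":     rng.normal(size=n).cumsum(),   # log real exchange rate
})

# Import demand: the coefficient on ln_rer is the exchange-rate elasticity e_m
fit = smf.ols("ln_imports ~ ln_income + ln_rer", data=df).fit()
e_m = fit.params["ln_rer"]
e_x = -0.6   # placeholder: estimated the same way from the export equation
print(f"|e_x| + |e_m| = {abs(e_x) + abs(e_m):.2f}  (> 1 satisfies Marshall-Lerner)")
```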
Procedia PDF Downloads 70