Search results for: language error
1242 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques
Authors: Ved Kulkarni, Karthik Kini
Abstract:
This experiment aims to understand how the media affects the power markets in the mainland United States and to study the duration of the reaction time between news updates and actual price movements. We have taken into account electric utility companies trading on the NYSE and excluded companies that are more politically involved and move with higher sensitivity to politics. The scraper checks for any news related to keywords, which are predefined and stored for each specific company. Based on this, the classifier allocates the effect into five categories: highly positive, positive, neutral, negative, or highly negative. The effect on the respective price movement is studied to understand the response time. Based on the observed response time, neural networks are trained to understand and react to changing market conditions, achieving the best strategy in each market. The stock trader would be day trading in the first phase and making option-strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market's response time to each price movement.
Keywords: data mining, language processing, artificial neural networks, sentiment analysis
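The five-way allocation step described above can be sketched as a simple keyword-lexicon classifier. The keywords, scores, and thresholds below are illustrative assumptions, not the authors' actual per-company lexicon.

```python
# Hypothetical keyword lexicon with signed weights (illustrative only).
LEXICON = {
    "surge": 2, "record profit": 2, "beat": 1, "upgrade": 1,
    "miss": -1, "downgrade": -1, "outage": -2, "bankruptcy": -2,
}

def classify(headline: str) -> str:
    """Map a headline to one of five effect categories via summed keyword scores."""
    text = headline.lower()
    score = sum(w for kw, w in LEXICON.items() if kw in text)
    if score >= 2:
        return "highly positive"
    if score == 1:
        return "positive"
    if score == 0:
        return "neutral"
    if score == -1:
        return "negative"
    return "highly negative"
```

In practice the paper's classifier would be trained per company; this fixed lexicon only illustrates the category-allocation interface.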
Procedia PDF Downloads 241
1241 On Driving Forces of Cultural Globalization and Its Retroaction: Under the Guidance of Skopos Theory
Authors: Zhai Yujia
Abstract:
Although quite a few papers have addressed various topics relevant to cultural and economic globalization separately, no scholar or researcher has yet stepped into this particular field. Economic globalization predates cultural globalization. Since the invention of currency, people have had the sense of making money for the purpose of living, supporting their families, or other personal reasons. This strong desire to earn a living is one of the incentives propelling trade, tourism, and other related economic activities, which provided services within the homeland at first and expanded into the whole world later, as global markets grew and matured. The need for such operations impels international communication and interaction. To achieve this, it is vital to recognize other cultures to some degree, including the language, customs, social etiquette, and history of different nations. All of this drives the process of cultural globalization. Conversely, it is clear that the development of cultural globalization accelerates the process of economic globalization in return. Under the guidance of Skopos theory (first proposed by Hans Vermeer, whose core principle is that the translation process is determined by its purpose), this paper aims to demonstrate that cultural globalization is not a process in isolation by thoroughly analyzing its driving forces and retroaction from an overview perspective. It intertwines with economic globalization; the two push each other to prosper gradually during their development, serving as indispensable parts of the globalization process.
Keywords: cultural globalization, driving forces, retroaction, Skopos theory
Procedia PDF Downloads 165
1240 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study
Authors: Ana Serafimovic, Karthik Devarajan
Abstract:
Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method aims to minimize the symmetric version of the Kullback-Leibler divergence, known as intrinsic information, and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma, and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods that also aim to minimize symmetric divergence measures.
Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence
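As a point of reference for the baseline methods mentioned above, the standard Lee-Seung multiplicative updates that minimize the one-sided generalized KL divergence can be sketched as follows. This is only a baseline sketch: the paper's own algorithm targets the symmetric divergence under exponential-family noise, which this code does not implement.

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, seed=0):
    """Lee-Seung multiplicative updates minimizing the (one-sided)
    generalized KL divergence D(V || WH). Baseline sketch only."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 1e-3          # small offset keeps entries positive
    H = rng.random((r, n)) + 1e-3
    for _ in range(n_iter):
        WH = W @ H + 1e-12                 # avoid division by zero
        H *= (W.T @ (V / WH)) / W.sum(axis=0, keepdims=True).T
        WH = W @ H + 1e-12
        W *= ((V / WH) @ H.T) / H.sum(axis=1, keepdims=True).T
    return W, H
```

The multiplicative form guarantees non-negativity is preserved at every step, which is why this update family is the usual starting point for divergence-based variants.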
Procedia PDF Downloads 248
1239 Domain Driven Design vs Soft Domain Driven Design Frameworks
Authors: Mohammed Salahat, Steve Wade
Abstract:
This paper presents and compares the Systematic Soft Domain Driven Design (SSDDD) framework to the Domain Driven Design (DDD) framework as a soft-systems approach to information systems development. SSDDD uses Soft Systems Methodology (SSM) as a guiding methodology to explore the problem situation, within which a sequence of UML-based design tasks is embedded, leading to the implementation of a software system using the Naked Objects framework. The framework has been used in action research projects involving the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. The framework was proposed and evaluated in our previous work; this paper presents a comparison between SSDDD and DDD to show how SSDDD improves on DDD as an approach to modelling and implementing business-domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.
Keywords: domain-driven design, soft domain-driven design, naked objects, soft language
Procedia PDF Downloads 300
1238 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios
Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu
Abstract:
Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills a niche in the literature, to the best of our knowledge, of accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are shown to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, driven primarily by the number of factors instead of the number of obligors, as is the case with Monte Carlo simulation.
The limitation of this method lies in the "curse of dimensionality" intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension-reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential applications of this method cover a wide range: from credit derivatives pricing to economic capital calculation for the banking book, default risk charge and incremental risk charge computation for the trading book, and even risk types other than credit risk.
Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method
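A minimal illustration of the Fourier-cosine (COS) inversion step underlying the method, assuming a known characteristic function (here the standard normal, for checkability) rather than the factor-copula portfolio-loss transform used in the paper:

```python
import numpy as np

def cos_density(phi, a, b, N, x):
    """Recover a density on [a, b] from its characteristic function phi
    via an N-term Fourier-cosine (COS) expansion."""
    k = np.arange(N)
    u = k * np.pi / (b - a)                          # cosine frequencies
    A = 2.0 / (b - a) * (phi(u) * np.exp(-1j * u * a)).real
    A[0] *= 0.5                                      # first term has weight 1/2
    return A @ np.cos(np.outer(u, x - a))
```

The same expansion, integrated term by term, yields the cumulative distribution function semi-analytically, which is the step the paper exploits for risk metrics.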
Procedia PDF Downloads 171
1237 Advances in Artificial Intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research intends to illustrate recent technological advancements associated with artificial intelligence. Recent research has revealed that decoding remains the foremost issue affecting speech recognition. To overcome this, researchers developed different statistical models. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMM). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic phonetics, and artificial intelligence. Artificial intelligence has been recognized as the most efficient and reliable of the methods used in speech recognition.
Keywords: speech recognition, acoustic phonetics, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance
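As an illustration of decoding with hidden Markov models, a minimal Viterbi decoder for a discrete-observation HMM can be sketched as below; the toy transition and emission matrices in the test are assumptions for demonstration, not from the study.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete HMM, in the log domain.
    pi: initial probs (S,); A: transitions (S, S); B: emissions (S, V)."""
    S, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])     # best log-prob ending in each state
    back = np.zeros((T, S), dtype=int)           # backpointers
    for t in range(1, T):
        cand = logd[:, None] + np.log(A)         # (prev, next) candidate scores
        back[t] = cand.argmax(axis=0)
        logd = cand.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):                # follow backpointers
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

In a speech recognizer the emission scores would come from the acoustic model and the transitions from the language/lexicon models; the decoding recursion itself is the same.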
Procedia PDF Downloads 480
1236 The Relation between Vitamin C and Oral Health
Authors: Mai Ashraf Talaat
Abstract:
Background: Vitamin C (ascorbic acid) is an essential nutrient for the development and repair of all body tissues. It can be obtained from a healthy diet or through supplementation. Due to its importance, vitamin C has become a mainstay in the treatment and prevention of many diseases and in maintaining immune, skin, bone, and overall health. This review article aims to discuss the studies and case reports conducted to evaluate the effect of vitamin C on oral health, and the recent advances in oral medicine that involve the use of vitamin C. Data/Sources: The review covered clinical studies, case reports, and published literature in the English language addressing this topic. An extensive search of the electronic databases PubMed, PubMed Central, Web of Science, the National Library of Medicine, and ResearchGate was performed. Conclusion: Vitamin C is thought to treat periodontal diseases and gingival enlargement. It also affects biofilm formation and therefore helps reduce caries incidence. Recently, vitamin C mesotherapy has been used to treat inflamed gingiva, bleeding gums, and gingival hyperpigmentation. More research and randomized controlled trials are needed on this specific topic for a more accurate judgment. Clinical significance: A minimally invasive approach, the use of vitamin C in dental care, could drastically reduce the need for surgical intervention.
Keywords: oral health, periodontology, vitamin C, gingivitis
Procedia PDF Downloads 81
1235 Empowering Business Students with Intercultural Communicative Competence through Multicultural Literature
Authors: Dorsaf Ben Malek
Abstract:
The function of culture in language teaching has changed because of globalization and the latest technologies. English has become a lingua franca, which has altered teaching objectives; the re-evaluation of cultural awareness is one of these changes. Business English teaching has also been subject to all of them. It would therefore be wrong to regard it as the diffusion of an unlimited listing of lexis, diagrams, charts, and statistics. In fact, business students' future careers will require business terminology together with intercultural communicative competence (ICC) to handle multicultural encounters and contribute to the international community. The first part of this paper is dedicated to the necessity of empowering business students with intercultural communicative competence, and the second focuses on the potential of multicultural literature for implementing ICC in business English teaching. This was demonstrated through qualitative action research on a group of Tunisian MA business students, which offered an opportunity to discover the potential of multicultural literature, together with inquiry-based learning, for enhancing business students' intercultural communicative competence. Data were collected through classroom observations, journals, and semi-structured interviews. Results were in favour of using multicultural literature to enhance business students' ICC. In addition, the short story can be a motivating way into reading literature, and inquiry-based learning can be an effective approach to teaching it.
Keywords: intercultural communicative competence, multicultural literature, short stories, inquiry-based learning
Procedia PDF Downloads 337
1234 Diversability and Diversity: Toward Including Disability/Body-Mind Diversity in Educational Diversity, Equity, and Inclusion
Authors: Jennifer Natalya Fink
Abstract:
Since the racial reckoning of 2020, almost every major educational institution has incorporated diversity, equity, and inclusion (DEI) principles into its administrative, hiring, and pedagogical practices. Yet these DEI principles rarely incorporate explicit language or critical thinking about disability. Despite the fact that, according to the World Health Organization, one in five people worldwide is disabled, making disabled people the largest minority group in the world, disability remains the neglected stepchild of DEI. Drawing on disability studies and crip theory frameworks, the underlying causes of this exclusion of disability from DEI are examined, such as stigma, shame, invisible disabilities, institutionalization/segregation/delineation from family, and competing models and definitions of disability. This paper explores both the ideological and practical shifts necessary to include disability in university DEI initiatives. It offers positive examples as well as conceptual frameworks such as 'diversability' for doing so. Using Georgetown University's 2020-2022 DEI initiatives as a case study, this paper describes how curricular infusion, accessibility, identity, community, and diversity administration infused one university's DEI initiatives with concrete disability-inclusive measures. It concludes with a consideration of how the very framework of DEI itself might be challenged and transformed if disability were included.
Keywords: diversity, equity, inclusion, disability, crip theory, accessibility
Procedia PDF Downloads 138
1233 Ontology Mapping with R-GNN for IT Infrastructure: Enhancing Ontology Construction and Knowledge Graph Expansion
Authors: Andrey Khalov
Abstract:
The rapid growth of unstructured data necessitates advanced methods for transforming raw information into structured knowledge, particularly in domain-specific contexts such as IT service management and outsourcing. This paper presents a methodology for automatically constructing domain ontologies using the DOLCE framework as the base ontology. The research focuses on expanding ITIL-based ontologies by integrating concepts from ITSMO, followed by the extraction of entities and relationships from domain-specific texts through transformers and statistical methods like formal concept analysis (FCA). In particular, this work introduces an R-GNN-based approach for ontology mapping, enabling more efficient entity extraction and ontology alignment with existing knowledge bases. Additionally, the research explores transfer learning techniques using pre-trained transformer models (e.g., DeBERTa-v3-large) fine-tuned on synthetic datasets generated via large language models such as LLaMA. The resulting ontology, termed IT Ontology (ITO), is evaluated against existing methodologies, highlighting significant improvements in precision and recall. This study advances the field of ontology engineering by automating the extraction, expansion, and refinement of ontologies tailored to the IT domain, thus bridging the gap between unstructured data and actionable knowledge.
Keywords: ontology mapping, knowledge graphs, R-GNN, ITIL, NER
Procedia PDF Downloads 26
1232 A Regression Model for Predicting Sugar Crystal Size in a Fed-Batch Vacuum Evaporative Crystallizer
Authors: Sunday B. Alabi, Edikan P. Felix, Aniediong M. Umo
Abstract:
Crystal size distribution is of great importance in sugar factories. It determines the market value of granulated sugar and also influences the cost of producing sugar crystals. Typically, sugar is produced using a fed-batch vacuum evaporative crystallizer. The crystallization quality is examined through the crystal size distribution at the end of the process, which is quantified by two parameters: the mean aperture (MA), the average crystal size of the distribution, and the coefficient of variation (CV), the width of the distribution. The lack of real-time measurement of sugar crystal size hinders its feedback control and the eventual optimisation of the crystallization process. An attractive alternative is to use a soft sensor (model-based method) for online estimation of the sugar crystal size. Unfortunately, the available models of the sugar crystallization process are not suitable, as they do not contain variables that can be measured easily online. The main contribution of this paper is the development of a regression model for estimating the sugar crystal size as a function of input variables that are easy to measure online. This has the potential to provide real-time estimates of crystal size for effective feedback control. Using seven input variables, namely initial crystal size (L₀), temperature (T), vacuum pressure (P), feed flow rate (Ff), steam flow rate (Fs), initial supersaturation (S₀), and crystallization time (t), preliminary studies were carried out using Minitab 14 statistical software. Based on the existing sugar crystallizer models and the typical ranges of these seven input variables, 128 datasets were obtained from a two-level factorial experimental design. These datasets were used to obtain a simple but online-implementable six-input crystal-size model; the initial crystal size (L₀) did not appear to play a significant role. The goodness of the resulting regression model was evaluated.
The coefficient of determination, R², was obtained as 0.994, and the maximum absolute relative error (MARE) was obtained as 4.6%. The high R² (~1.0) and the reasonably low MARE are an indication that the model is able to predict sugar crystal size accurately as a function of the six easy-to-measure online variables. Thus, the model can be used as a soft sensor to provide real-time estimates of sugar crystal size during the crystallization process in a fed-batch vacuum evaporative crystallizer.
Keywords: crystal size, regression model, soft sensor, sugar, vacuum evaporative crystallizer
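The two reported goodness-of-fit metrics, R² and MARE, can be computed from an ordinary least-squares fit as in this sketch. The synthetic data in the test is an assumption for illustration; the actual 128-run factorial datasets are not reproduced here.

```python
import numpy as np

def fit_and_score(X, y):
    """Least-squares fit with intercept; returns coefficients, R², and the
    maximum absolute relative error (MARE), the two metrics reported above."""
    Xd = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    yhat = Xd @ beta
    ss_res = ((y - yhat) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    mare = np.max(np.abs((y - yhat) / y))           # assumes y has no zeros
    return beta, r2, mare
```

MARE is a worst-case measure, so unlike R² it flags individual poorly predicted runs even when the overall fit looks excellent.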
Procedia PDF Downloads 212
1231 A Heteroskedasticity Robust Test for Contemporaneous Correlation in Dynamic Panel Data Models
Authors: Andreea Halunga, Chris D. Orme, Takashi Yamagata
Abstract:
This paper proposes a heteroskedasticity-robust Breusch-Pagan test of the null hypothesis of zero cross-section (or contemporaneous) correlation in linear panel-data models, without necessarily assuming independence of the cross-sections. The procedure allows for fixed, strictly exogenous and/or lagged dependent regressor variables, as well as quite general forms of both non-normality and heteroskedasticity in the error distribution. The asymptotic validity of the test procedure is predicated on the number of time-series observations, T, being large relative to the number of cross-section units, N, in that: (i) either N is fixed as T→∞; or (ii) N²/T→0 as both T and N diverge, jointly, to infinity. Given this, asymptotic theory is not expected to provide an adequate guide to finite-sample performance when T/N is "small". Because of this, we also propose, and establish the asymptotic validity of, a number of wild bootstrap schemes designed to provide improved inference when T/N is small. Across a variety of experimental designs, a Monte Carlo study suggests that the predictions from asymptotic theory do, in fact, provide a good guide to the finite-sample behaviour of the test when T is large relative to N. However, when T and N are of similar orders of magnitude, discrepancies between the nominal and empirical significance levels occur, as predicted by the first-order asymptotic analysis. On the other hand, for all the experimental designs, the proposed wild bootstrap approximations do improve agreement between nominal and empirical significance levels when T/N is small, with a recursive-design wild bootstrap scheme performing best, in general, and providing quite close agreement between the nominal and empirical significance levels of the test even when T and N are of similar size.
Moreover, in comparison with the wild bootstrap "version" of the original Breusch-Pagan test, our experiments indicate that the corresponding version of the heteroskedasticity-robust Breusch-Pagan test appears reliable. As an illustration, the proposed tests are applied to a dynamic growth model for a panel of 20 OECD countries.
Keywords: cross-section correlation, time-series heteroskedasticity, dynamic panel data, heteroskedasticity-robust Breusch-Pagan test
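The wild bootstrap idea, resampling with Rademacher weights so that each observation's variance is preserved under heteroskedasticity, can be sketched for a generic statistic as follows. This simplified version reweights residuals directly; it is not the paper's recursive-design scheme for dynamic panels.

```python
import numpy as np

def wild_bootstrap_pvalue(resid, stat_fn, B=999, seed=0):
    """Two-sided wild-bootstrap p-value. Each replicate multiplies the
    residuals by i.i.d. Rademacher (+1/-1) weights, which imposes the null
    (mean zero) while keeping each observation's own variance."""
    rng = np.random.default_rng(seed)
    t_obs = stat_fn(resid)
    count = 0
    for _ in range(B):
        w = rng.choice([-1.0, 1.0], size=len(resid))
        if abs(stat_fn(resid * w)) >= abs(t_obs):
            count += 1
    return (count + 1) / (B + 1)            # add-one correction
```

In the panel setting the statistic would be the robust Breusch-Pagan statistic recomputed on bootstrap-generated data; the reference-distribution logic is the same.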
Procedia PDF Downloads 436
1230 Genre Analysis of Postgraduate Theses and Dissertations: Case of Statement of the Problem
Authors: H. Mashhady, H. A. Manzoori, M. Doosti, M. Fatollahi
Abstract:
This study reports descriptive research in the form of a genre analysis of postgraduate theses and dissertations at three Iranian universities: Ferdowsi, Tehran, and Tarbiat Modares. The researchers sought to depict the generic structure of the "statement of the problem" section of PhD dissertations and MA theses, and to find any probable variation based on the year to which the dissertations belonged, to see whether genre consciousness had developed among Iranian postgraduates. To obtain data, the "statement of the problem" sections of 90 PhD dissertations and MA theses from 2001 to 2013 in Teaching English as a Foreign Language (TEFL) at the above-mentioned universities were selected. Frequency counts were employed as the quantitative method of data analysis, while genre analysis was used as the qualitative method. Inter-rater reliability was found to be about 0.93. Results revealed that students in different degrees at each of these universities used various generic structures for writing the "statement of the problem". Moreover, a comparison of different time periods (2001-2006 and 2007-2013) revealed that postgraduates in the second period, regardless of their degree and university, employed more similar generic structures, which can optimistically be attributed to a general rise in genre awareness.
Keywords: genre, genre analysis, PhD and MA dissertations, statement of the problem, generic structure
Procedia PDF Downloads 670
1229 Development of a Feedback Control System for a Lab-Scale Biomass Combustion System Using Programmable Logic Controller
Authors: Samuel O. Alamu, Seong W. Lee, Blaise Kalmia, Marc J. Louise Caballes, Xuejun Qian
Abstract:
The application of combustion technologies for the thermal conversion of biomass and solid wastes to energy has long been a major solution for the effective handling of wastes. Lab-scale biomass combustion systems have been observed to be economically viable and socially acceptable, but major concerns are the environmental impacts of the process and deviations of the temperature distribution within the combustion chamber. Both high and low combustion chamber temperatures may affect the overall combustion efficiency and gaseous emissions. Therefore, there is an urgent need to develop a control system that measures the deviations of chamber temperature from set target values, feeds these deviations (which act as disturbances in the system) back as an input signal, and adjusts operating conditions to correct the errors. In this research study, the major components of the feedback control system were determined, assembled, and tested. In addition, control algorithms were developed to actuate the operating conditions (e.g., air velocity, fuel feeding rate) using ladder logic functions embedded in a Programmable Logic Controller (PLC). The developed control algorithm, with chamber temperature as the feedback signal, was integrated into the lab-scale swirling fluidized bed combustor (SFBC) to investigate the temperature distribution at different heights of the combustion chamber under various operating conditions. The air blower rates and fuel feeding rates obtained from automatic control operations were correlated with manual inputs. There was no observable difference in the correlated results, indicating that the written PLC program functions were adequate for the experimental study of the lab-scale SFBC. The experimental results were analyzed to study the effect of air velocity in the range of 222-273 ft/min and fuel feeding rates of 60-90 rpm on the chamber temperature.
The developed temperature-based feedback control system was shown to be adequate in controlling the airflow and the fuel feeding rate for the overall biomass combustion process, as it helps to minimize the steady-state error.
Keywords: air flow, biomass combustion, feedback control signal, fuel feeding, ladder logic, programmable logic controller, temperature
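One scan of such a temperature-based feedback rule can be sketched as below; in the actual system this compare/scale/clamp logic is expressed as ladder rungs in the PLC, and the setpoint and proportional gain here are illustrative assumptions. Only the 60-90 rpm feed range comes from the study.

```python
def control_step(temp, setpoint, base_rpm=75.0, gain=0.5, lo=60.0, hi=90.0):
    """One control scan: the chamber-temperature error proportionally
    adjusts the fuel-feed rpm, clamped to the 60-90 rpm operating range."""
    error = setpoint - temp          # positive error -> chamber too cool
    rpm = base_rpm + gain * error    # proportional correction
    return min(hi, max(lo, rpm))     # clamp to actuator limits
```

The same structure applies to the air blower rate; each scan of the PLC re-evaluates the error and re-clamps the actuator command.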
Procedia PDF Downloads 132
1228 AI and the Future of Misinformation: Opportunities and Challenges
Authors: Noor Azwa Azreen Binti Abd. Aziz, Muhamad Zaim Bin Mohd Rozi
Abstract:
Moving towards the Fourth Industrial Revolution, artificial intelligence (AI) is now more popular than ever. The subject gains significance every day and is continually expanding, often merging with other fields. Rather than remaining passive observers, there are benefits to understanding modern technology by delving into its inner workings. In a world teeming with digital information, the impact of AI on the spread of disinformation has garnered significant attention. Misinformation, the dissemination of inaccurate or misleading information, poses a serious threat to democratic society, public debate, and individual decision-making. This article delves into the connection between AI and the dissemination of false information, exploring its potential, risks, and ethical issues as AI technology advances. The rise of AI has ushered in a new era in the dissemination of misinformation, as AI-driven technologies are increasingly responsible for curating, recommending, and amplifying information on online platforms. While AI holds the potential to enhance the detection and mitigation of misinformation through natural language processing and machine learning, it also raises concerns about the amplification and propagation of false information. AI-powered deepfake technology, for instance, can generate hyper-realistic videos and audio recordings, making it increasingly challenging to discern fact from fiction.
Keywords: artificial intelligence, digital information, disinformation, ethical issues, misinformation
Procedia PDF Downloads 98
1227 Correlation between Funding and Publications: A Pre-Step towards Future Research Prediction
Authors: Ning Kang, Marius Doornenbal
Abstract:
Funding is a very important, if not crucial, resource for research projects. Usually, funding organizations publish a description of the funded research to describe the scope of the award. Logically, we would expect research outcomes to align with this funding award. For that reason, we might be able to predict future research topics based on present funding award data. That said, it remains to be shown if and how future research topics can be predicted from funding information. In this paper, we extract funding project information and the abstracts of papers generated by those projects from the Gateway to Research database as one group, and use papers from the same domains and publication years in the Scopus database as a baseline comparison group. We annotate both the project awards and the papers resulting from the funded projects with linguistic features (noun phrases), and then calculate tf-idf and cosine similarity between these two sets of features. We show that the cosine similarity within the project-generated-papers group is greater than that of the project-baseline group, and that these two groups of similarities are significantly different. Based on this result, we conclude that funding information does correlate with the content of future research output for the funded project at the topical level. How funding really changes the course of science or of scientific careers remains an elusive question.
Keywords: natural language processing, noun phrase, tf-idf, cosine similarity
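The tf-idf and cosine-similarity computation described above can be sketched in a few lines; for simplicity this version tokenizes on whitespace rather than extracting noun phrases, and uses a smoothed idf.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Sparse tf-idf vectors (dicts) with smoothed idf over simple tokens."""
    tokens = [doc.lower().split() for doc in docs]
    df = Counter(t for toks in tokens for t in set(toks))    # document freq
    n = len(docs)
    idf = {t: math.log((1 + n) / (1 + c)) + 1 for t, c in df.items()}
    return [{t: c * idf[t] for t, c in Counter(toks).items()} for toks in tokens]

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Replacing the whitespace tokenizer with a noun-phrase chunker recovers the feature set used in the study; the similarity computation is unchanged.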
Procedia PDF Downloads 248
1226 Trajectory Tracking of Fixed-Wing Unmanned Aerial Vehicle Using Fuzzy-Based Sliding Mode Controller
Authors: Feleke Tsegaye
Abstract:
The work in this thesis mainly focuses on trajectory tracking of a fixed-wing unmanned aerial vehicle (FWUAV) using a fuzzy-based sliding mode controller (FSMC) for surveillance applications. Unmanned aerial vehicles (UAVs) are general-purpose aircraft built to fly autonomously. This technology is applied in a variety of sectors, including the military, to improve defense, surveillance, and logistics. The model of a FWUAV is complex due to its high non-linearity and coupling effects. In this thesis, input decoupling is achieved by extracting the dominant inputs during the design of the controller and treating the remaining inputs as uncertainty. Proper and steady flight maneuvering of UAVs under uncertain and unstable circumstances is the most critical problem for researchers studying UAVs. An FSMC technique is suggested to tackle the complexity of FWUAV systems. The trajectory tracking control algorithm primarily uses the sliding-mode (SM) variable structure control method to address the system's control issue. In the SM control, a fuzzy logic control (FLC) algorithm is utilized in place of the discontinuous part of the SM controller to reduce the chattering effect. In the reaching and sliding stages of SM control, Lyapunov theory is used to assure finite-time convergence. A comparison between the conventional SM controller and the suggested controller is made with respect to the chattering effect as well as tracking performance. It is evident that chattering is effectively reduced, the suggested controller provides a quick response with a minimal steady-state error, and the controller is robust in the face of unknown disturbances. The designed control strategy is simulated with the nonlinear model of the FWUAV using the MATLAB®/Simulink® environment.
The simulation result shows the suggested controller operates effectively, maintains the aircraft’s stability, and holds the aircraft’s targeted flight path despite the presence of uncertainty and disturbances.
Keywords: fixed-wing UAVs, sliding mode controller, fuzzy logic controller, chattering, coupling effect, surveillance, finite-time convergence, Lyapunov theory, flight path
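The chattering-reduction idea, replacing the discontinuous switching term with a smooth law, can be illustrated on a toy first-order plant. The saturation (boundary-layer) controller below is a crude stand-in for the paper's fuzzy logic block, and the plant, gains, and disturbance values are invented for illustration:

```python
def simulate(controller, steps=4000, dt=0.005, d=0.2, x0=1.0):
    """Integrate dx/dt = u + d for a pure-integrator plant tracking x_ref = 0."""
    x, us = x0, []
    for _ in range(steps):
        u = controller(x)          # sliding surface is simply s = x here
        x += (u + d) * dt
        us.append(u)
    flips = sum(1 for a, b in zip(us, us[1:]) if a * b < 0)  # chattering proxy
    return x, flips

K, PHI = 2.0, 0.05
# Classical discontinuous sliding-mode law: u = -K * sign(s)
sign_ctrl = lambda s: -K * (1 if s > 0 else -1 if s < 0 else 0)
# Boundary-layer (saturation) law: a crude stand-in for the fuzzy logic
# block that smooths the discontinuous switching term.
sat_ctrl = lambda s: -K * max(-1.0, min(1.0, s / PHI))

x_sign, flips_sign = simulate(sign_ctrl)
x_sat, flips_sat = simulate(sat_ctrl)
print(abs(x_sat) < 0.02, flips_sat < flips_sign)  # expected: True True
```

The sign controller forces the state to oscillate rapidly around the sliding surface (many control sign flips), while the smoothed law converges to a small steady-state error without chattering, which is the qualitative behavior the abstract reports for the FSMC.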
Procedia PDF Downloads 61
1225 Intercultural Education through Literature Reception: An in-Depth Study of the Cultural and Literary Relations of Romania and China during 1948-2018
Authors: Iulia Elena Gîță
Abstract:
According to the sociological theory of literature, constraints on the creation and sharing of cultural works can be placed between two extremes: one with a high level of politicization and the other with a high level of commercialization. The overall objective of the present research is to follow the principles of the Sociology of Translation to closely map and analyse the publishing activity of Romania concerning China and Chinese literature during four stages of Romanian history between 1948 and 2018. This paper thus proposes an extended approach to literature and to its cultural, political and economic reception. In achieving the proposed objectives, the research expands far beyond the literary text itself to its macro context, analysing, through quantitative research methods, a statistical database created in two phases: the first part contains literary and non-fictional works that address and discuss issues related to China; the second part includes literary translations of Chinese literature into Romanian, either by direct translation or through an intermediate language. Throughout this paper we map not only the number of works, but also the topics approached by writers across the periods of the political life of Romania.
Keywords: bilateral relations, Chinese literature, intercultural understanding, international relations, socio-cultural reception, socio-political constraints, publishing
Procedia PDF Downloads 138
1224 A Contrastive Rhetoric Study: The Use of Textual and Interpersonal Metadiscoursal Markers in Persian and English Newspaper Editorials
Authors: Habibollah Mashhady, Moslem Fatollahi
Abstract:
This study contrasts the use of metadiscoursal markers in English and Persian newspaper editorials as persuasive text types. These markers are linguistic elements which do not add to the propositional content of the text; rather, they serve to realize Halliday’s (1985) textual and interpersonal functions of language. At first, some of the most common markers from five subcategories of Text Connectives, Illocution Markers, Hedges, Emphatics, and Attitude Markers were identified in both English and Persian newspapers. Then, the frequency of occurrence of these markers was recorded in both an English and a Persian corpus, each consisting of 44 randomly selected editorials (18,000 words each) from several English and Persian newspapers. After that, using a two-way chi-square analysis, the overall observed χ² value was found to be highly significant, so the null hypothesis of no difference was confidently rejected. Finally, in order to determine the contribution of each subcategory to the overall χ² value, one-way chi-square analyses were applied to the individual subcategories. The results indicated that only two of the five subcategories of markers were statistically significant. This difference is then attributed to the differing spirits prevailing in the linguistic communities involved. Regarding the minor research question, it was found that, in contrast to English writers, Persian writers are more writer-oriented in their writing.
Keywords: metadiscoursal markers, textual meta-function, interpersonal meta-function, persuasive texts, English and Persian newspaper editorials
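The two-way chi-square computation on marker frequencies can be sketched as follows; the observed counts are hypothetical examples, not the study's data:

```python
def chi_square(table):
    """Two-way chi-square statistic for an observed frequency table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    return sum(
        (obs - exp) ** 2 / exp
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
        for exp in [row_tot[i] * col_tot[j] / grand]  # expected count
    )

# Hypothetical marker counts (English vs Persian rows; columns are the five
# subcategories: connectives, illocution markers, hedges, emphatics, attitude).
observed = [
    [120, 45, 60, 80, 35],
    [90, 50, 110, 60, 70],
]
x2 = chi_square(observed)
# df = (2-1)*(5-1) = 4; critical value at alpha = 0.05 is 9.488
print(x2 > 9.488)  # expected: True
```

With real corpus counts, a statistic exceeding the critical value is what licenses rejecting the null hypothesis of no difference between the two languages.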
Procedia PDF Downloads 576
1223 Polypropylene Matrix Enriched With Silver Nanoparticles From Banana Peel Extract For Antimicrobial Control Of E. coli and S. epidermidis To Maintain Fresh Food
Authors: Michail Milas, Aikaterini Dafni Tegiou, Nickolas Rigopoulos, Eustathios Giaouris, Zaharias Loannou
Abstract:
Nanotechnology, a relatively new scientific field, addresses the manipulation of nanoscale materials and devices, which are governed by unique properties, and is applied in a wide range of industries, including food packaging. The incorporation of nanoparticles into polymer matrices used for food packaging is a highly researched field today. One such combination is silver nanoparticles with polypropylene. In the present study, the synthesis of the silver nanoparticles was carried out by a natural method; in particular, a ripe banana peel extract (BPE) was used. This method is superior to others as it stands out for its environmental friendliness, high efficiency and low cost. Specifically, a 1.75 mM silver nitrate (AgNO₃) solution was used, with a BPE concentration of 1.7% v/v, an incubation period of 48 hours at 70°C and a pH of 4.3; after its preparation, the polypropylene (PP) films were soaked in it. For the PP films, random PP spheres were melted at 170-190°C into molds of 0.8 cm diameter. This polymer was chosen as it is suitable for plastic parts and reusable plastic containers of various types that are intended to come into contact with food without compromising its quality and safety. The antimicrobial test against Escherichia coli DFSNB1 and Staphylococcus epidermidis DFSNB4 was performed on the films. The films with silver nanoparticles showed at least a 100-fold reduction in both strains compared to those without silver nanoparticles. The limit of detection is the lower limit of the vertical error lines in the presence of nanoparticles, which is 3.11. The main reasons that led to the adsorption of nanoparticles are the porous nature of polypropylene and the adsorption capacity of nanoparticles on the surface of the films due to hydrophobic-hydrophilic forces.
The most significant parameters that contributed to the results of the experiment include the following: the stage of ripening of the banana during the preparation of the plant extract, the temperature and residence time of the nanoparticle solution in the oven, the residence time of the polypropylene films in the nanoparticle solution, the number of nanoparticles inoculated onto the films and, finally, the time the films stayed in the refrigerator so that they could dry and be ready for antimicrobial testing.
Keywords: antimicrobial control, banana peel extract, E. coli, natural synthesis, microbe, plant extract, polypropylene films, S. epidermidis, silver nanoparticles, random PP
Procedia PDF Downloads 182
1222 Development of Sustainable Wind Speed Forecasting Framework for Wind Energy Farms
Authors: Mohammed Bou-Rabee
Abstract:
The significance of wind energy is rising as the world shifts toward clean and renewable energy sources. Wind energy generates electricity without releasing greenhouse gases, making it a feasible substitute for fossil fuels. This contributes to the reduction of carbon emissions, mitigates climate change, and enhances air quality. Wind energy, unlike fossil fuels, is a renewable resource. Investing in wind energy allows nations to reduce their reliance on imported fossil fuels, improving their energy security; it also ensures more stable energy costs while safeguarding economies from the volatility of oil and gas markets. Recent technological advancements have markedly decreased the cost of wind energy over the past few decades, establishing it as one of the most cost-effective sources of new electricity in many regions globally. These advancements have significantly enhanced turbine efficiency, augmented energy output, and reduced costs. The fluctuating characteristics of wind energy present an ongoing research challenge that has captivated the scientific community. Accurate forecasting of wind energy is essential for effective wind farm operation and management, smart grid stabilization, optimizing energy storage, investment and financial planning, and improved participation in energy markets. The extraction of wind energy depends on several factors, with wind speed being the most critical, as it directly affects the power output of a wind turbine: power output grows with the cube of wind speed. In addressing these research challenges, we have developed an efficient wind speed forecasting system employing advanced machine learning (ML) and statistical techniques. We created a hybrid time series forecasting model using an ensemble learning approach that integrates a Light Gradient Boosting Machine (LGBoost), Extreme Gradient Boosting (XGBoost), and Bayesian Linear Regression (BLR).
We then utilized the Random Forest (RF) technique for feature selection. The model can predict wind speed with a minimum mean square error (MSE) of 0.096 and a maximum R² score of 0.924.
Keywords: wind energy, renewable resource, turbine efficiency, power output
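As a rough sketch of the forecasting pipeline, the snippet below builds an equal-weight ensemble of two simple one-step-ahead predictors and scores it with MSE and R². The synthetic series and the pure-Python base models (persistence and a least-squares AR(1)) are simplified stand-ins for the LGBoost/XGBoost/BLR ensemble described above, not the paper's method:

```python
import math

# Deterministic toy "wind speed" series (m/s): slow oscillations around 8 m/s.
series = [8 + 2 * math.sin(0.1 * t) + 0.5 * math.sin(0.37 * t) for t in range(400)]
train, test = series[:300], series[300:]

def fit_ar1(data):
    """Least-squares AR(1) with intercept: w[t] ~ a + b * w[t-1]."""
    xs, ys = data[:-1], data[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

a, b = fit_ar1(train)
preds, actuals = [], []
prev = train[-1]
for w in test:
    p_ar = a + b * prev        # autoregressive base model
    p_persist = prev           # persistence base model
    preds.append(0.5 * p_ar + 0.5 * p_persist)  # equal-weight ensemble
    actuals.append(w)
    prev = w                   # one-step-ahead: feed in the true last value

mse = sum((p - y) ** 2 for p, y in zip(preds, actuals)) / len(preds)
ybar = sum(actuals) / len(actuals)
r2 = 1 - sum((p - y) ** 2 for p, y in zip(preds, actuals)) \
       / sum((y - ybar) ** 2 for y in actuals)
print(round(mse, 3), r2 > 0.8)
```

The real system replaces the base learners with gradient-boosted trees and Bayesian regression and selects input features with a random forest, but the ensemble-then-score structure is the same.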
Procedia PDF Downloads 3
1221 A Modeling Approach for Blockchain-Oriented Information Systems Design
Abstract:
Blockchain technology is regarded as one of the most promising technologies, with the potential to trigger a technological revolution. However, beyond the Bitcoin industry, we have not yet seen large-scale application of blockchain in the domains that are supposed to be impacted, such as supply chains, financial networks, and intelligent manufacturing. The reasons lie not only in the difficulties of blockchain implementation, but also in the challenges of blockchain-oriented information systems design. First, blockchain members are self-interested actors belonging to organizations with different existing information systems; as they expect different information inputs and outputs from the blockchain application, a common language protocol is needed to facilitate communication between blockchain members. Second, given the decentralization of blockchain organization, there is no central authority to organize and coordinate the business processes; thus, information systems built on blockchain should support more adaptive business processes. This paper aims to address these difficulties by providing a modeling approach for blockchain-oriented information systems design. We investigate the information structure of distributed-ledger data with conceptual modeling techniques and ontology theories, and build an effective ontology mapping method for inter-organization information flows and blockchain information records. Further, we study distributed-ledger-ontology-based business process modeling to support the adaptive enterprise on blockchain.
Keywords: blockchain, ontology, information systems modeling, business process
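The idea of an ontology mapping between organization-local schemas and shared ledger records can be sketched as follows; the field names, organizations, and record shapes are invented for illustration and are not the paper's ontology:

```python
# Shared ledger ontology: canonical field names every member maps onto.
ONTOLOGY = {"party_id", "item_code", "quantity", "timestamp"}

# Per-organization mappings from the local schema to the shared ontology.
SUPPLIER_MAP = {"vendor_ref": "party_id", "sku": "item_code",
                "qty": "quantity", "shipped_at": "timestamp"}
RETAILER_MAP = {"partner": "party_id", "product_no": "item_code",
                "units": "quantity", "received_at": "timestamp"}

def to_ledger(record, mapping):
    """Translate a local record into canonical ledger form."""
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    assert set(out) == ONTOLOGY, "record does not cover the shared ontology"
    return out

def from_ledger(entry, mapping):
    """Translate a ledger entry back into an organization's local schema."""
    inverse = {v: k for k, v in mapping.items()}
    return {inverse[k]: v for k, v in entry.items()}

shipment = {"vendor_ref": "V-17", "sku": "A100", "qty": 5, "shipped_at": "2024-01-02"}
entry = to_ledger(shipment, SUPPLIER_MAP)       # supplier writes to the ledger
local_view = from_ledger(entry, RETAILER_MAP)   # retailer reads in its own schema
print(local_view["units"])  # expected: 5
```

The point of the shared ontology is exactly this: each member keeps its existing schema, and only the mapping to the canonical ledger vocabulary must be agreed on.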
Procedia PDF Downloads 4571220 Sexual Behaviors and Condom Attitude among Injecting Drug Users in Hai Phong, Vietnam: Qualitative Findings
Authors: Tanvir Ahmed, Thanh N. Long, Phan T. Huong, Donald E. Stewart
Abstract:
This paper presents views on condom use and the contexts of safe and unsafe sexual practices with different sexual partners, and the associated relationships, among Injecting Drug Users (IDUs) in Hai Phong, Vietnam. Fifteen IDUs participated, and two local interviewers conducted qualitative semi-structured face-to-face interviews in Vietnamese in September-October 2012. Data were analyzed thematically. Attitudes working against condom use included the ability to negotiate with or convince Female Sex Workers (FSWs); not recognizing the risk, importance, or necessity of condoms; partners' dislike of condoms; and receiving extra money or drugs from clients. On the other hand, self-awareness, family-consciousness, suspicion of STI presence, fear of getting HIV, and client negotiation sometimes resulted in safe-sex practice. A thematic diagram was developed to present the relationship (strong/weak) between condom attitudes and sexual practice (safe/unsafe) by partner type. The experiences and views reflected in the qualitative data emphasize the heightened need for safe-sex education, especially among young IDUs (male/female), highlighting sexual transmission risk.
Keywords: AIDS, HIV, injecting drug user, risk behaviors, Vietnam
Procedia PDF Downloads 860
1219 Extracting Plowing Forces for Aluminum 6061-T6 Using a Small Number of Drilling Experiments
Authors: Ilige S. Hage, Charbel Y. Seif
Abstract:
Forces measured during cutting operations are generated by the cutting process and include parasitic forces, known as edge forces. A fraction of these measured forces arises from the tertiary cutting zone, such as flank or edge forces. Most machining models are designed for sharp tools, where edge forces represent the portion of the measured forces associated with deviations of the tool from an ideal sharp geometry. Flank forces are challenging to isolate. The most common method involves plotting the force at a constant cutting speed against uncut chip thickness and then extrapolating to zero feed. The resulting positive intercept on the vertical axis is identified as the edge or plowing force. The aim of this research is to identify the effect of tool rake angle and cutting speed on flank forces and to develop a force model as a function of tool rake angle and cutting speed for predicting plowing forces. Edge forces were identified based on a limited number of drilling experiments using a 10 mm twist drill, where lip edge cutting forces were collected from 2.5 mm pre-cored samples. Cutting lip forces were measured with feed rates varying from 0.04 to 0.64 mm/rev and spindle speeds ranging from 796 to 9868 rpm, at a sampling rate of 200 Hz. By using real-time force measurements as the drill enters the workpiece, this study provides an economical method for analyzing the effect of tool geometry and cutting conditions on generated cutting forces, reducing the number of required experimental setups. As a result, an empirical model predicting parasitic edge forces was developed as a function of the cutting velocity, tool rake angle, and clearance angle along the lip of the tool, demonstrating strong agreement with edge forces reported in the literature for Aluminum 6061-T6. The model achieved an R² value of 0.92 and a mean square error of 4%, validating the accuracy of the proposed methodology. The presented methodology leverages variations in machining parameters.
This approach contrasts with traditional machining experiments, where the turning process typically serves as the basis for force measurements and each experimental setup is characterized by a single cutting velocity, tool rake angle, and clearance angle.
Keywords: drilling, plowing, edge forces, cutting force, torque
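The zero-feed extrapolation used to isolate the edge force can be sketched with a simple least-squares line fit; the feed/force numbers below are hypothetical, chosen so the underlying edge force is about 50 N:

```python
# Hypothetical thrust-force readings (N) at several feeds (mm/rev); in the
# paper these would come from pre-cored drilling tests, not from a formula.
feeds  = [0.04, 0.08, 0.16, 0.32, 0.64]
forces = [83.1, 114.9, 178.2, 306.1, 561.8]   # ~ 50 N edge force + cutting term

# Ordinary least-squares line fit: force = edge_force + slope * feed.
n = len(feeds)
mx = sum(feeds) / n
my = sum(forces) / n
slope = sum((x - mx) * (y - my) for x, y in zip(feeds, forces)) \
        / sum((x - mx) ** 2 for x in feeds)
edge_force = my - slope * mx   # intercept at zero feed = plowing/edge force
print(round(edge_force, 1))    # prints a value near 50
```

The positive vertical-axis intercept is the quantity the abstract identifies as the edge or plowing force; the study builds its empirical model by repeating this extraction across cutting velocities and tool angles.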
Procedia PDF Downloads 12
1218 Augmenting Classroom Reality
Authors: Kerrin Burnell
Abstract:
In a world of increasingly technology-dependent students, the English language classroom should ideally keep up with developments to keep students engaged as much as possible. Unfortunately, as is the case in Oman, funding is not always adequate to ensure students have the most up-to-date technology, and most institutions are still reliant on paper-based textbooks. In order to try and bridge the gap between the technology available (smartphones) and textbooks, augmented reality (AR) technology can be utilized to enhance classroom, homework, and extracurricular activities. AR involves overlaying media (videos, images, etc.) on top of physical objects (posters, book pages, etc.) and then sharing the media. This case study involved introducing students to a freely available entry-level AR app called Aurasma. Students were asked to augment their English textbooks, word walls, research project posters, and extracurricular posters. Through surveys, interviews and an analysis of time spent accessing the different media, the appropriateness of the technology for the classroom was assessed. Results indicate that the use of AR has positive effects on many aspects of the English classroom. Increased student engagement, total time spent on task, interaction, and motivation were evident, along with a decrease in technology-related anxiety. As it is proving very difficult to get tablets or even laptops into classrooms in Oman, these preliminary results indicate that many positive outcomes will come from introducing students to this innovative technology.
Keywords: augmented reality, classroom technology, classroom innovation, engagement
Procedia PDF Downloads 387
1217 The Barriers That ESOL Learners Face Accessing Further Education
Authors: Jamie David Hopkin
Abstract:
This study aims to make a unique contribution by helping colleges and community learning and development (CLD) institutes aid progression within ESOL learning. The study investigates the barriers that migrant and displaced learners face accessing further education in Scotland. It also includes a set of recommendations for both colleges and CLD institutes to help ESOL learners in their journey to further education. The research found that integration into Scottish society is one of the biggest motivators for ESOL students to learn English. It also found that gender and "gender roles" contribute to the barriers that learners face in terms of progression and learning. The study also reviews the literature related to ESOL learning in Scotland and finds that there are only two main policies that support ESOL learning, both of which are slightly outdated in terms of supporting progression. This study aims to help bridge the gap in knowledge around the progression from informal learning to formal education. The recommendations made in this study are intended to help institutes and learners on their journey to a positive destination. The main beneficiaries of this research are current and future ESOL learners in Scotland, ESOL institutes, and TESOL professionals.
Keywords: community learning and development, English for speakers of other languages, further education, higher education TESOL, teaching English as a second language
Procedia PDF Downloads 146
1216 Nuclear Safety and Security in France in the 1970s: A Turning Point for the Media
Authors: Jandot Aurélia
Abstract:
In France, concern about nuclear safety and security did not really appear in the main media before the beginning of the 1970s. The gradual changes in its perception are studied here through the arguments given in the main French news magazines, linked with several parameters. As this represents a considerable number of copies, and thus of information, the main articles are selected here, as well as the main "mental images" aiming to persuade readers, which led public awareness to evolve. Indeed, in 1970s France, these evolutions did not happen in a day. Over the period, many articles were still in favor of nuclear power plants and promoted the technological advances made in this field; they had to be taken into account. But arguments and mental images discrediting the perception of nuclear technology gradually built up. Among these were the environmental impacts of this industry, as the question of pollution progressively appeared. So, between 1970 and 1979, the language changed, as did the perceptible objectives of the communication, allowing us to discern the deeper intentions of the editorial staffs of the French news magazines. It is all these changes that are emphasized here, over a period when the safety and security concerns linked to nuclear technology, until then a field for specialists, progressively became a social issue seemingly open to all.
Keywords: environmental impacts, media discourse, nuclear security, public awareness
Procedia PDF Downloads 286
1215 Discipline-Specific Culture: A Purpose-Based Investigation
Authors: Sihem Benaouda
Abstract:
English is gaining an international identity as it affects every academic and professional field in the world. Without increasing their cultural understanding, it would obviously be difficult to fully prepare learners for communication in a globalised environment. The concept of culture is intricate and needs to be elucidated, especially in an English language teaching (ELT) context. The study focuses on investigating the cultural studies integrated into different types of English for specific purposes (ESP) materials, as opposed to English for general purposes (EGP) textbooks. A qualitative methodology based on a triangulation of techniques was applied through materials analysis of five textbooks in advanced EGP and three types of ESP, together with a semi-structured interview conducted with Algerian ESP practitioners. Data analysis revealed that culture in ESP textbooks is not overtly isolated into chapters and that cultural studies are predominantly present in business and economics materials, namely English for hotel and catering staff, tourism, and flight attendants. However, implicit cultural instruction is signalled in the social sciences and is negligible in science and technology sources. In terms of content, cultural studies in EGP are more related to generic topics, whereas, in some ESP materials, the topics are rather oriented to the specific field they belong to. Furthermore, the respondents' answers showed an unawareness of the importance of culture in ESP teaching, besides some disregard for culture teaching per se in ESP contexts.
Keywords: ESP, EGP, cultural studies, textbooks, teaching, materials
Procedia PDF Downloads 111
1214 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life
Authors: Desplanches Maxime
Abstract:
Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-Distributed Stochastic Neighbor Embedding, we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. 
Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.
Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression
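A heavily simplified sketch of the diagnosis pipeline, dimensionality reduction followed by a regression on the reduced feature, is given below. The synthetic cell data, the use of plain PCA via power iteration in place of the paper's PCA/t-SNE step, and the linear fit in place of random forest or support vector regression are all illustrative assumptions, not the authors' implementation:

```python
import math

# Hypothetical aging dataset in the spirit of the model-built database:
# each cell has an SEI-thickness indicator, a correlated capacity-fade
# feature, and a known remaining life (cycles).
cells = []
for i in range(1, 21):
    sei = float(i)                          # non-destructive aging indicator
    fade = 0.8 * sei + 0.1 * math.sin(i)    # correlated second feature
    life = 100.0 - 4.0 * sei - 0.5 * fade   # ground-truth remaining life
    cells.append((sei, fade, life))

# First principal component via power iteration on the 2x2 covariance matrix.
n = len(cells)
ms = sum(c[0] for c in cells) / n
mf = sum(c[1] for c in cells) / n
cxx = sum((c[0] - ms) ** 2 for c in cells) / n
cyy = sum((c[1] - mf) ** 2 for c in cells) / n
cxy = sum((c[0] - ms) * (c[1] - mf) for c in cells) / n
v = (1.0, 0.0)
for _ in range(50):
    w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
    norm = math.hypot(*w)
    v = (w[0] / norm, w[1] / norm)

# Project onto PC1, then fit remaining life vs the projection (least squares).
ts = [(c[0] - ms) * v[0] + (c[1] - mf) * v[1] for c in cells]
ys = [c[2] for c in cells]
mt, my = sum(ts) / n, sum(ys) / n
b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) \
    / sum((t - mt) ** 2 for t in ts)
a = my - b * mt
rel_err = sum(abs(a + b * t - y) / y for t, y in zip(ts, ys)) / n
print(rel_err < 0.05)  # expected: True
```

In the paper, the same shape is scaled up: the Newman-model database supplies many aging indicators, PCA/t-SNE compresses them, and a random forest or SVM regressor maps the compressed features to remaining life within the reported 5% error margin.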
Procedia PDF Downloads 74
1213 Internet Memes: A Mirror of Culture and Society
Authors: Alexandra-Monica Toma
Abstract:
As the internet became a ruling force of society, computer-mediated communication has enriched its methods of conveying meaning by combining linguistic means with visual means of expressivity. One of the elements of cyberspace is what we call a meme: a succinct, visually engaging tool used to communicate ideas or emotions, usually in a funny or ironic manner. Coined by Richard Dawkins in the late 1970s to refer to cultural genes, this term now denotes a special type of vernacular language used to share content on the internet. This research aims to analyse the basic mechanism underlying meme creation as a blend of innovation and imitation, and approaches some of the most widely used image macros remixed to generate new content, while also pointing out success strategies. Moreover, this paper discusses whether memes can transcend the light-hearted and playful mood they mirror and become biting and sharp cultural comments. The study also uses the concept of multimodality and stresses how text interacts with image, discussing three types of relations between the two: symmetry, amplification, and contradiction. We furthermore show that memes are cultural artifacts and virtual tropes highly dependent on context and societal issues, using a corpus of memes created in relation to the COVID-19 pandemic.
Keywords: context, computer-mediated communication, memes, multimodality
Procedia PDF Downloads 188