Search results for: variable precision rough sets theory

5195 The Predictive Power of Successful Scientific Theories: An Explanatory Study on Their Substantive Ontologies through Theoretical Change

Authors: Damian Islas

Abstract:

Debates on realism in science concern two different questions: (I) whether the unobservable entities posited by theories can be known; and (II) whether any knowledge we have of them is objective or not. Question (I) arises from the doubt that since observation is the basis of all our factual knowledge, unobservable entities cannot be known. Question (II) arises from the doubt that since scientific representations are inextricably laden with the subjective, idiosyncratic, and a priori features of human cognition and scientific practice, they cannot convey any reliable information on how their objects are in themselves. A way of understanding scientific realism (SR) is through three lines of inquiry: ontological, semantic, and epistemological. Ontologically, scientific realism asserts the existence of a world independent of the human mind. Semantically, scientific realism assumes that theoretical claims about reality have truth values and, thus, should be construed literally. Epistemologically, scientific realism holds that theoretical claims offer us knowledge of the world. Nowadays, the literature on scientific realism has proceeded rather far beyond the realism versus antirealism debate. One such development, structural realism, represents a middle-ground position between the two, according to which science can attain justified true beliefs concerning relational facts about the unobservable realm but cannot attain justified true beliefs concerning the intrinsic nature of any objects occupying that realm. That is, the structural content of scientific theories about the unobservable can be known, but facts about the intrinsic nature of the entities that figure as place-holders in those structures cannot be known. There are two possible versions of structural realism: Epistemological Structural Realism (ESR) and Ontic Structural Realism (OSR). On ESR, an agnostic stance is preserved with respect to the natures of unobservable entities, but the possibility of knowing the relations obtaining between those entities is affirmed. OSR includes the rather striking claim that when it comes to the unobservables theorized about within fundamental physics, relations exist, but objects do not. Focusing on ESR, questions arise concerning its ability to explain the empirical success of a theory. Empirical success certainly involves predictive success, and predictive success implies a theory’s power to make accurate predictions. But a theory’s power to make any predictions at all seems to derive precisely from its core axioms or laws concerning unobservable entities and mechanisms, and not simply the sort of structural relations often expressed in equations. The specific challenge to ESR concerns its ability to explain the explanatory and predictive power of successful theories without appealing to their substantive ontologies, which are often not preserved by their successors. The response to this challenge will depend on the various and subtly different versions of the ESR and OSR stances, which show a progression, from eliminativist OSR to moderate OSR, of gradual increase in the ontological status accorded to objects. Knowing the relations between unobserved entities is methodologically identical to asserting that these relations between unobserved entities exist.

Keywords: eliminativist ontic structural realism, epistemological structuralism, moderate ontic structural realism, ontic structuralism

Procedia PDF Downloads 117
5194 Effects of Variable Viscosity on Radiative MHD Flow in a Porous Medium Between Two Vertical Wavy Walls

Authors: A. B. Disu, M. S. Dada

Abstract:

This study was conducted to investigate the two-dimensional heat transfer of a free convective-radiative MHD (magnetohydrodynamic) flow with temperature-dependent viscosity and a heat source of a viscous incompressible fluid in a porous medium between two vertical wavy walls. The fluid viscosity is assumed to vary as an exponential function of temperature. The flow is assumed to consist of a mean part and a perturbed part. The perturbed quantities were expressed in terms of a complex exponential series of plane-wave form. The resultant differential equations were solved by the Differential Transform Method (DTM). The numerical computations were presented graphically to show the salient features of the fluid flow and heat transfer characteristics. The skin friction and Nusselt number were also analyzed for various governing parameters.
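
As a hedged illustration of the Differential Transform Method named in the abstract, the sketch below applies the DTM recursion to a toy initial-value problem, y'(t) = y(t) with y(0) = 1; the coupled flow and energy equations of the paper are not reproduced here.

```python
# Minimal illustration of the Differential Transform Method (DTM), applied to the
# toy problem y'(t) = y(t), y(0) = 1 (exact solution: exp(t)). It only shows the
# recursion idea behind the method used for the full flow problem.
import math

def dtm_exponential(n_terms=15):
    # Differential transform of y' = y gives (k + 1) * Y[k + 1] = Y[k]
    Y = [0.0] * n_terms
    Y[0] = 1.0                      # initial condition y(0) = 1
    for k in range(n_terms - 1):
        Y[k + 1] = Y[k] / (k + 1)   # recursion from the transformed ODE
    return Y

def dtm_eval(Y, t):
    # Reassemble the series y(t) = sum_k Y[k] * t**k
    return sum(c * t ** k for k, c in enumerate(Y))

Y = dtm_exponential()
print(dtm_eval(Y, 1.0), math.exp(1.0))   # series value vs exact exp(1)
```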

Keywords: differential transform method, MHD free convection, porous medium, two dimensional radiation, two wavy walls

Procedia PDF Downloads 446
5193 Perceptions of Community Members in Lephalale Area, Limpopo Province, Towards Water Conservation: Development of a Psychological Model

Authors: M. L. Seretlo-Rangata, T. Sodi, S. Govender

Abstract:

Despite interventions by various governments to regulate water demand and address water scarcity, literature shows that billions of people across the world continue to struggle with access because not everyone contributes equally to conservation efforts. Behavioral factors such as individual and collective aspects of cognition and commitment have been found to play an important role in water conservation. The aim of the present study was to explore the perceptions of community members in the Lephalale area, Limpopo province, towards water conservation with a view to developing an explanatory psychological model on water conservation. Twenty (20) participants who relied on communal taps to access water in Lephalale Local Municipality, Limpopo province, were selected through purposeful sampling. In-depth, semi-structured, individual face-to-face interviews were used to gather data and were analyzed utilizing thematic content analysis (TCA). The research findings revealed that there are various psychological effects of water scarcity on communities, such as emotional distress, interpersonal conflicts and disruptions of daily activities of living. Additionally, the study results showed that the coping strategies developed by participants to deal with water scarcity included adopting alternative water use behaviors as well as adjusting current behaviors and lifestyles. Derived from the study findings, a psychological model of water conservation was developed. The model incorporates some ideas from the Value-Belief-Norm (VBN) theory and the Afrocentric theory. The model suggests that people’s worldviews, including their values, beliefs and culture, are significant determinants of their pro-environmental behaviors. The study concludes by recommending that authorities and policymakers should consider psychological factors when developing water management programs, strategies and interventions with the consultation of psychology experts.

Keywords: water conservation, psychological model, pro-environmental behaviour, conservation psychology, water-use behaviour

Procedia PDF Downloads 70
5192 A M/E/c Queuing Hub Maximal Covering Location Model with Fuzzy Parameter

Authors: M. H. Fazel Zarandi, N. Moshahedi

Abstract:

The hub location problem appears in a variety of applications such as medical centers, firefighting facilities, cargo delivery systems and telecommunication network design. The location of service centers has a strong influence on the congestion at each of them and, consequently, on the quality of service. This paper presents a fuzzy maximal hub covering location problem (FMCHLP) in which the travel cost between any pair of nodes is considered a fuzzy variable. In order to consider the quality of service, we model each hub as a queue. Arrivals follow a Poisson distribution and service times follow an Erlang distribution. In this paper, at first, a nonlinear mathematical programming model is presented. Then, we convert it to a linear one. We solved the linear model using GAMS software for instances of up to 25 nodes; for larger sizes, owing to the complexity of hub covering location problems, a simulated annealing algorithm was developed to solve and test the model. We also used the possibilistic c-means clustering method in order to find an initial solution.
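
The simulated annealing heuristic mentioned above could look roughly like the sketch below, which selects hubs to maximize covered demand on synthetic data; the coverage radius, demand values and neighborhood move are illustrative assumptions, not the paper's actual formulation.

```python
# A minimal simulated annealing sketch for hub selection, in the spirit of the
# heuristic described in the abstract. The cost model (random travel times,
# fixed coverage radius, covering objective) is illustrative only.
import math
import random

random.seed(0)
N, P, RADIUS = 25, 4, 0.6                      # nodes, hubs to open, coverage radius
pts = [(random.random(), random.random()) for _ in range(N)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
demand = [random.randint(1, 10) for _ in range(N)]

def covered_demand(hubs):
    # Demand at node i counts as covered if its nearest open hub lies within RADIUS.
    return sum(demand[i] for i in range(N)
               if min(dist[i][h] for h in hubs) <= RADIUS)

def anneal(iters=5000, t0=1.0, alpha=0.999):
    hubs = random.sample(range(N), P)
    best, best_val, temp = hubs[:], covered_demand(hubs), t0
    for _ in range(iters):
        cand = hubs[:]
        cand[random.randrange(P)] = random.choice([j for j in range(N) if j not in hubs])
        delta = covered_demand(cand) - covered_demand(hubs)
        if delta >= 0 or random.random() < math.exp(delta / temp):
            hubs = cand                        # accept improving or occasional worse move
            if covered_demand(hubs) > best_val:
                best, best_val = hubs[:], covered_demand(hubs)
        temp *= alpha                          # geometric cooling
    return best, best_val

print(anneal())
```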

Keywords: fuzzy modeling, location, possibilistic clustering, queuing

Procedia PDF Downloads 391
5191 The Professionalization of Teachers in the Context of the Development of a Future-Oriented Technical and Vocational Education and Training System in Egypt

Authors: Sherin Ahmed El-Badry Sadek

Abstract:

This research examines what contribution the professionalization of teachers can make to the development of a future-oriented vocational education and training system in Egypt. For this purpose, a needs assessment of the Egyptian vocational training system with the central actors and prevailing structures forms the foundation of the study, which is theoretically underpinned by the attempt to resolve, to some extent, the tension between Luhmann's systems theory approach and the actor-centered theory of professional teacher competence. The vocational education system, in particular, must be adaptable and flexible due to the rapidly changing qualification requirements. In view of the pace of technological progress and the associated market changes, vocational training is no longer to be understood only as an educational tool aimed at those who achieve poorer academic performance or are not motivated to take up a degree. Rather, it is to be understood as a cornerstone for the development of society, and international experience shows that it is the core of lifelong learning. But to what extent have the education systems been able to react to these changes in their political, social, and technological systems? And how effective and sustainable are these changes actually? The vocational training system has a particular impact on other social systems, which is why the parameters with the greatest leverage must be identified and adjusted. Even if systems and structures are highly relevant, teachers must not hide behind them and must instead strive to develop further and to constantly learn. Despite numerous initiatives and programs to reform vocational training in Egypt, including the EU-funded Technical and Vocational Education and Training (TVET) reform phase I and phase II, the fit of skilled workers to the needs of the labor market is still insufficient. Surveys show that the majority of employers are very dissatisfied with the graduates that the vocational training system produces. The data were collected through guideline-based interviews with experts from the education system and relevant neighboring systems, which allowed the reconstruction of central in-depth structures, as well as patterns of action and interpretation, which were subsequently fed into a matrix of recommendations for action. These recommendations are addressed to different decision-makers and stakeholders and are intended to serve as an impetus for the sustainable improvement of the Egyptian vocational training system. The research findings have shown that education, and in particular vocational training, is a political field that is characterized by a high degree of complexity and which is embedded in a barely manageable, highly branched landscape of structures and actors. At the same time, the vocational training system is not only determined by endogenous factors but also increasingly shaped by the dynamics of the environment and the neighboring social subsystems, with a mutual dependency relationship becoming apparent. These interactions must be taken into account in all decisions, even if prioritization of measures and thus a clear sequence and process orientation are of great urgency.

Keywords: competence orientation, educational policies, education systems, expert interviews, globalization, organizational development, professionalization, systems theory, teacher training, TVET system, vocational training

Procedia PDF Downloads 150
5190 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
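
A minimal sketch of the text-to-prediction step described above, assuming TF-IDF features from report text feeding a Random Forest classifier; the toy reports and labels stand in for the Indiana University dataset.

```python
# Hedged sketch: TF-IDF features from report text feeding a Random Forest
# classifier. The reports, labels, and split below are placeholders, not the
# Indiana University dataset itself.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

reports = ["no acute cardiopulmonary abnormality",
           "right lower lobe opacity concerning for pneumonia",
           "clear lungs, no effusion",
           "cardiomegaly with pulmonary vascular congestion"] * 25
labels = [0, 1, 0, 1] * 25                 # 1 = abnormal finding (toy labels)

X_tr, X_te, y_tr, y_te = train_test_split(reports, labels, test_size=0.3, random_state=42)
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      RandomForestClassifier(n_estimators=200, random_state=42))
model.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```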

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 42
5189 Probabilistic and Stochastic Analysis of a Retaining Wall for C-Φ Soil Backfill

Authors: André Luís Brasil Cavalcante, Juan Felix Rodriguez Rebolledo, Lucas Parreira de Faria Borges

Abstract:

A methodology for the probabilistic analysis of active earth pressure on a retaining wall for c-Φ soil backfill is described in this paper. The Rosenblueth point estimate method is used to estimate the failure probability of a gravity retaining wall. The basic principle of this methodology is to use two point estimates, i.e., the standard deviation and the mean value, to examine a variable in the safety analysis. The simplicity of this framework assures its wide application. The calculation requires 2ⁿ evaluations, since the system is governed by n variables. In this study, a probabilistic model based on the Rosenblueth approach for the computation of the overturning probability of failure of a retaining wall is presented. The obtained results show the advantages of this kind of model in comparison with the deterministic solution. In a relatively easy way, the uncertainty in the wall and fill parameters is taken into account, and some practical results can be obtained for the retaining structure design.
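
A compact sketch of the Rosenblueth 2ⁿ point-estimate procedure follows; the performance function and soil statistics are hypothetical placeholders, not the paper's overturning model.

```python
# Rosenblueth 2**n point-estimate method for a generic performance function g(x).
# The safety-factor function and statistics shown here are placeholders only.
from itertools import product
from math import sqrt
from statistics import NormalDist

def rosenblueth(g, means, sigmas):
    # Uncorrelated, symmetric variables: 2**n equally weighted evaluation points
    n = len(means)
    pts = [g([m + s * e for m, s, e in zip(means, sigmas, signs)])
           for signs in product((-1.0, 1.0), repeat=n)]
    mean_g = sum(pts) / len(pts)
    var_g = sum(p * p for p in pts) / len(pts) - mean_g ** 2
    return mean_g, sqrt(var_g)

# Hypothetical overturning safety factor FS(c, phi), for illustration only.
def safety_factor(x):
    c, phi_deg = x
    return 0.9 + 0.01 * c + 0.02 * phi_deg

mu_fs, sd_fs = rosenblueth(safety_factor, means=[20.0, 30.0], sigmas=[4.0, 3.0])
# Failure probability assuming FS is approximately normal: P(FS < 1)
print(NormalDist(mu_fs, sd_fs).cdf(1.0))
```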

Keywords: retaining wall, active earth pressure, backfill, probabilistic analysis

Procedia PDF Downloads 417
5188 The Transformation of Beauty and Ugliness in Art Aesthetics

Authors: Wenjin Li, Jing Sun

Abstract:

Since the mid-eighteenth century, when philosophers began to talk about aesthetic consciousness, beauty gradually became an independent system, while ugliness was often considered the opposite of beauty and neglected. Yet ugliness has its own regularities and aesthetic value. This paper explores the relationship between beauty and ugliness and the three forms of transformation between beauty and ugliness in artworks, looking at the history of ugliness in the East and the West and giving an insight into the role of the artist in the transformation between beauty and ugliness.

Keywords: artistic aesthetics, arts theory, aesthetic of the ugly, beauty and ugliness, arts of east and west

Procedia PDF Downloads 83
5187 A Sustainable Supplier Selection and Order Allocation Based on Manufacturing Processes and Product Tolerances: A Multi-Criteria Decision Making and Multi-Objective Optimization Approach

Authors: Ravi Patel, Krishna K. Krishnan

Abstract:

In global supply chains, appropriate and sustainable suppliers play a vital role in supply chain development and feasibility. In a large organization with a huge number of suppliers, it is necessary to divide suppliers based on their past history of quality and delivery for each product category. Since the performance of any organization depends widely on its suppliers, well-evaluated selection criteria and decision-making models lead to improved supplier assessment and development. In this paper, the SCOR® performance evaluation approach and ISO standards are used to determine selection criteria for better supplier assessment by using a hybrid model of the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS). AHP is used to determine the global weights of the criteria, which helps TOPSIS to compute supplier scores by using triangular fuzzy set theory. Both qualitative and quantitative criteria are taken into consideration for the proposed model. In addition, a multi-product and multi-time-period model is selected for order allocation. The optimization model integrates multi-objective integer linear programming (MOILP) for order allocation and the hybrid approach for supplier selection. The proposed MOILP model optimizes order allocation based on manufacturing processes and product tolerances as per the manufacturer’s requirement for quality products. The integrated model and solution approach are tested to find optimized solutions for different scenarios. The detailed analysis shows the superiority of the proposed model over other solutions that consider individual decision-making models.
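
The weighting and ranking steps might be sketched as below, using an AHP pairwise matrix reduced by row geometric means and a crisp TOPSIS closeness score as a simplified stand-in for the fuzzy TOPSIS of the paper; all matrix entries are illustrative.

```python
# Simplified sketch: AHP criterion weights from a pairwise comparison matrix
# (geometric-mean approximation), then a crisp TOPSIS score as a stand-in for
# the fuzzy TOPSIS used in the paper. Matrix values are illustrative.
import numpy as np

# Pairwise comparisons among 3 criteria (e.g. quality, delivery, cost) -- illustrative
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = np.prod(A, axis=1) ** (1 / A.shape[0])    # row geometric means
w /= w.sum()                                  # AHP global weights

# Supplier scores on each criterion (rows = suppliers), benefit criteria assumed
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 6.0, 7.0],
              [6.0, 8.0, 9.0]])
V = w * X / np.linalg.norm(X, axis=0)         # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)           # TOPSIS relative closeness
print(closeness)                              # higher = better supplier
```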

Keywords: AHP, fuzzy set theory, multi-criteria decision making, multi-objective integer linear programming, TOPSIS

Procedia PDF Downloads 169
5186 CFD Modeling and Optimization of Gas Cyclone Separator for Performance Improvement

Authors: N. Beit Saeid

Abstract:

Cyclones are used in the control of industrial air pollution; they separate particulates by means of the centrifugal forces generated by the spatial geometry of the cyclone. Their simple design, low capital and maintenance costs, and adaptability to a wide range of operating conditions have made cyclones one of the most widely used industrial dust collectors. Their cost of operation is proportional to the fan energy required to overcome their pressure drop. An optimized geometry of the cyclone outlet diffuser could potentially reduce exit pressure losses without affecting collection efficiency. Three rectangular outlets and a radial outlet with a variable opening were analyzed on two cyclones. Pressure drop was investigated for inlet velocities from about 10 to 20 m s⁻¹. The radial outlet reduced cyclone pressure drop by between 8.7 and 11.9 percent when its exit area was equal to the flow area of the cyclone vortex finder or gas exit. A simple payback based on avoided energy costs was estimated to be between 3600 and 5000 h, not including installation cost.

Keywords: cyclone, CFD, optimization, genetic algorithm

Procedia PDF Downloads 380
5185 A New Method to Estimate the Low Income Proportion: Monte Carlo Simulations

Authors: Encarnación Álvarez, Rosa M. García-Fernández, Juan F. Muñoz

Abstract:

Estimation of a proportion has many applications in economics and social studies. A common application is the estimation of the low income proportion, which gives the proportion of people classified as poor within a population. In this paper, we present this poverty indicator and propose to use the logistic regression estimator for the problem of estimating the low income proportion. Various sampling designs are presented. Using a real data set obtained from the European Survey on Income and Living Conditions, Monte Carlo simulation studies are carried out to analyze the empirical performance of the logistic regression estimator under the various sampling designs considered in this paper. Results derived from the Monte Carlo simulation studies indicate that the logistic regression estimator can be more accurate than the customary estimator under these sampling designs. The stratified sampling design can also provide more accurate results.
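
A hedged Monte Carlo sketch of the comparison described above is shown below, with synthetic data in place of the European Survey on Income and Living Conditions and simple random sampling in place of the paper's full set of designs.

```python
# Monte Carlo sketch comparing the customary (sample-mean) estimator of the low
# income proportion with a logistic-regression estimator that borrows strength
# from an auxiliary variable known for the whole population. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
N, n, R = 20_000, 300, 500                    # population, sample size, replications
aux = rng.gamma(2.0, 1.0, N)                  # auxiliary variable (e.g. household size)
income = 1000 * aux * rng.lognormal(0, 0.5, N)
poor = (income < np.quantile(income, 0.6) * 0.6).astype(int)   # below a poverty line
true_p = poor.mean()

err_mean, err_logit = [], []
for _ in range(R):
    idx = rng.choice(N, n, replace=False)     # simple random sample
    err_mean.append(poor[idx].mean() - true_p)
    m = LogisticRegression().fit(aux[idx, None], poor[idx])
    err_logit.append(m.predict_proba(aux[:, None])[:, 1].mean() - true_p)

print("RMSE customary:", np.sqrt(np.mean(np.square(err_mean))))
print("RMSE logistic :", np.sqrt(np.mean(np.square(err_logit))))
```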

Keywords: poverty line, risk of poverty, auxiliary variable, ratio method

Procedia PDF Downloads 454
5184 The Drama and Dynamics of Economic Shocks and Households Responses in Nigeria

Authors: Doki Naomi Onyeje, Doki Gowon Ama

Abstract:

The past four years have been traumatic for Nigerians, who have had to deal with a number of complex economic issues with dire consequences for the economy. Households have had to respond variously to some of these problems in peculiar ways, depending, of course, on the nature and character of a particular shock. The type, magnitude, intensity and duration of a particular shock may determine the different household responses. While households’ responses to the Global Financial Crisis and the COVID-19 pandemic have been documented by researchers, other economic shocks have continued to emerge in Nigeria. The dramatic turn of events since the new government came on board on May 29th, 2023, has introduced a new economic twist that households will have to adjust to. This study, therefore, sets out to examine household responses by disaggregating them by their livelihood sources. A survey of 420 households across North Central Nigeria will be conducted to generate information on the respective responses. A multinomial logit regression analysis will be employed to test the hypothesis that livelihood source(s) influence household responses to economic shocks. Consequently, responses from public and private households will be examined. Household responses are expected to show some similarities, but some peculiar responses across groups are also expected to emerge, and these differences will guide group-specific interventions. The Theatre for Development (TfD) approach will be used to disseminate and propagate results from this study to and among stakeholders for effective policy frameworks.
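
Assuming simulated survey data, the planned multinomial logit could be set up roughly as follows; the response categories and livelihood groups are placeholders.

```python
# Hedged sketch of the planned analysis: a multinomial logit of the household
# response category on livelihood-source indicators. Categories and data are
# simulated placeholders for the survey of 420 households.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 420
livelihood = rng.choice(["public", "private", "farming"], size=n)
# Simulated response: 0 = cut consumption, 1 = extra income activity, 2 = sell assets
response = rng.integers(0, 3, size=n)

X = pd.get_dummies(pd.Series(livelihood), drop_first=True).astype(float)
X = sm.add_constant(X)
model = sm.MNLogit(response, X).fit(disp=False)
print(model.summary())
```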

Keywords: drama, dynamics, economic shocks, household responses, Nigeria

Procedia PDF Downloads 72
5183 Managing the Transition from Voluntary to Mandatory Climate Reporting: The Role of Carbon Accounting

Authors: Qingliang Tang

Abstract:

The transition from voluntary to mandatory carbon reporting (also referred to as climate reporting) poses serious challenges for accounting professionals aiming to support firms in achieving net-zero goals. The accounting literature addresses the topics that are currently bewildering accounting academics and professional accountants: how to make accounting a useful tool for management to achieve a carbon-neutral business model. This paper explores the evolving role of carbon accounting within corporate financial reporting systems, emphasizing its integration as a crucial component. Key challenges addressed include data availability, climate risk assessment, defining reporting boundaries, selecting appropriate greenhouse gas (GHG) accounting methodologies, and integrating climate-related events into traditional financial statements. A dynamic, integrated carbon accounting framework is proposed to facilitate this transformative process effectively. Furthermore, the paper identifies critical knowledge gaps and sets forth a research agenda aimed at enhancing transparency and relevance in carbon accounting and reporting systems, thereby empowering informed decision-making. The purpose of the paper is to succinctly capture the essence of carbon accounting practice in the transitional period, focusing on the challenges, proposed solutions, and future research directions in the realm of carbon accounting and mandatory climate reporting.

Keywords: mandatory carbon reporting, carbon management, net zero target, sustainability, climate risks

Procedia PDF Downloads 15
5182 Multivariate Analytical Insights into Spatial and Temporal Variation in Water Quality of a Major Drinking Water Reservoir

Authors: Azadeh Golshan, Craig Evans, Phillip Geary, Abigail Morrow, Zoe Rogers, Marcel Maeder

Abstract:

Twenty-two physicochemical variables were determined in water samples collected weekly from January to December 2013 from three sampling stations located within a major drinking water reservoir. Classical Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) analysis was used to investigate the environmental factors associated with the physico-chemical variability of the water samples at each of the sampling stations. Matrix-augmentation MCR-ALS (MA-MCR-ALS) was also applied, and the two sets of results were compared for interpretative clarity. Links between these factors, reservoir inflows and catchment land uses were investigated and interpreted in relation to the chemical composition of the water and the resolved geographical distribution profiles. The results suggested that the major factors affecting reservoir water quality were those associated with agricultural runoff, with evidence of influence on algal photosynthesis within the water column. Water quality variability within the reservoir was also found to be strongly linked to physical parameters such as water temperature and the occurrence of thermal stratification. The two methods applied (MCR-ALS and MA-MCR-ALS) led to similar conclusions; however, MA-MCR-ALS appeared to provide results more amenable to interpretation of temporal and geographical variation than those obtained through classical MCR-ALS.
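
As an illustrative analog of the matrix-augmentation idea (not MCR-ALS itself), the sketch below stacks three stations' data blocks and resolves non-negative temporal and variable profiles with NMF; the data are synthetic.

```python
# Illustrative analog of matrix-augmentation factor resolution: three stations'
# (weeks x variables) data blocks are stacked row-wise and decomposed into
# non-negative temporal profiles and variable loadings. Real MCR-ALS adds
# constraints (closure, selectivity) not shown here; data below are synthetic.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
weeks, n_vars, n_factors = 52, 22, 3
blocks = [np.abs(rng.normal(1.0, 0.3, (weeks, n_vars))) for _ in range(3)]
D_aug = np.vstack(blocks)                      # matrix augmentation: stations stacked

model = NMF(n_components=n_factors, init="nndsvda", max_iter=500, random_state=0)
C = model.fit_transform(D_aug)                 # concentration-like temporal profiles
S = model.components_                          # variable-loading profiles

# Split the resolved temporal profiles back per station for interpretation
per_station = np.split(C, 3, axis=0)
print([p.shape for p in per_station], S.shape)
```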

Keywords: drinking water reservoir, multivariate analysis, physico-chemical parameters, water quality

Procedia PDF Downloads 291
5181 Performance and Voyage Analysis of Marine Gas Turbine Engine, Installed to Power and Propel an Ocean-Going Cruise Ship from Lagos to Jeddah

Authors: Mathias U. Bonet, Pericles Pilidis, Georgios Doulgeris

Abstract:

An aero-derivative marine Gas Turbine engine model is simulated to be installed as the main propulsion prime mover to power a cruise ship which is designed and routed to transport intending Muslim pilgrims for the annual hajj pilgrimage from Nigeria to the Islamic port city of Jeddah in Saudi Arabia. A performance assessment of the Gas Turbine engine has been conducted by examining the effect of varying aerodynamic and hydrodynamic conditions encountered at various geographical locations along the scheduled transit route during the voyage. The investigation focuses on the overall behavior of the Gas Turbine engine employed to power and propel the ship as it operates under ideal and adverse conditions to be encountered during calm and rough weather according to the different seasons of the year under which the voyage may be undertaken. The variation of engine performance under varying operating conditions has been considered a very important economic issue, addressed by determining the time and speed with which the journey is completed, as well as the quantity of fuel required for undertaking the voyage. The assessment also focuses on the increased resistance caused by the fouling of the submerged portion of the ship hull surface, with its resultant effect on the power output of the engine as well as the overall performance of the propulsion system. Daily ambient temperature levels were obtained by accessing data from the UK Meteorological Office, while the varying degrees of turbulence along the transit route according to the Beaufort scale were also obtained as major input variables of the investigation. By assuming the ship to be navigating the Atlantic Ocean and the Mediterranean Sea during the winter, spring and summer seasons, the performance modeling and simulation were accomplished through the use of an integrated Gas Turbine performance simulation code known as ‘Turbomach’ along with a Matlab-generated code named ‘Poseidon’, both of which have been developed at the Power and Propulsion Department of Cranfield University. As a case study, the results of the various assumptions have further revealed that the marine Gas Turbine is a reliable and available alternative to the conventional marine propulsion prime movers that have dominated the maritime industry until now. The techno-economic and environmental assessment of this type of propulsion prime mover has enabled the determination of the effect of changes in weather and sea conditions on the ship speed as well as the trip time and the quantity of fuel required to be burned throughout the voyage.

Keywords: ambient temperature, hull fouling, marine gas turbine, performance, propulsion, voyage

Procedia PDF Downloads 185
5180 Approach for Demonstrating Reliability Targets for Rail Transport during Low Mileage Accumulation in the Field: Methodology and Case Study

Authors: Nipun Manirajan, Heeralal Gargama, Sushil Guhe, Manoj Prabhakaran

Abstract:

In the railway industry, train sets are designed based on contractual requirements (mission profile), where reliability targets are measured in terms of the mean distance between failures (MDBF). However, during the beginning of revenue service, trains do not achieve the designed mission profile distance (mileage) within the timeframe due to infrastructure constraints, scarcity of commuters or other operational challenges, thereby not respecting the original design inputs. Since trains do not run sufficiently and do not achieve the designed mileage within the specified time, the car builder runs the risk of not achieving the contractual MDBF target. This paper proposes a constant-failure-rate-based model to deal with situations where mileage accumulation is not a part of the design mission profile. The model provides an appropriate MDBF target to be demonstrated based on the actual accumulated mileage. A case study of rolling stock running in the field is undertaken to analyze the failure data and MDBF target demonstration during low mileage accumulation. The results of the case study show that, with the proposed method, reliability targets are achieved under low mileage accumulation.
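
Under the constant-failure-rate assumption, the appropriation logic could be sketched as follows; the mileage, failure counts and confidence level are illustrative, not the contractual figures.

```python
# Hedged sketch of constant-failure-rate MDBF demonstration under low mileage.
# With failures assumed Poisson, a target MDBF is demonstrated at confidence c if
# observing r or fewer failures would be improbable (<= 1 - c) were the true MDBF
# only equal to the target. Figures below are illustrative, not contractual.
from math import exp

def poisson_cdf(r, m):
    term, total = exp(-m), exp(-m)
    for k in range(1, r + 1):
        term *= m / k
        total += term
    return total

def max_allowed_failures(mdbf_target_km, accumulated_km, confidence=0.7):
    m = accumulated_km / mdbf_target_km        # expected failures if MDBF == target
    if poisson_cdf(0, m) > 1 - confidence:
        return None                            # not demonstrable yet at this mileage
    r = 0
    while poisson_cdf(r + 1, m) <= 1 - confidence:
        r += 1
    return r

def appropriated_mdbf(accumulated_km, failures, confidence=0.7):
    # Largest MDBF demonstrable at the given confidence with the observed failures
    lo, hi = 1.0, accumulated_km
    for _ in range(60):                        # bisection on the target value
        mid = 0.5 * (lo + hi)
        if poisson_cdf(failures, accumulated_km / mid) <= 1 - confidence:
            lo = mid                           # mid is still demonstrable, push higher
        else:
            hi = mid
    return lo

print(max_allowed_failures(40_000, 600_000))   # allowed failures at 70% confidence
print(appropriated_mdbf(600_000, 10))          # demonstrable MDBF with 10 failures
```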

Keywords: mean distance between failures, mileage-based reliability, reliability target appropriations, rolling stock reliability

Procedia PDF Downloads 265
5179 Study of Half-Metallic Ferromagnetism in CeFeO3

Authors: A. Abbad, W. Benstaali

Abstract:

Using first-principles calculations based on density functional theory and the generalized gradient approximation (GGA), we predict the electronic and magnetic properties of the CeFeO3 orthorhombic perovskite. The calculated densities of states presented in this study identify the metallic behavior of CeFeO3 when the GGA scheme is used, whereas with GGA+U it exhibits a half-metallic character with an integer magnetic moment of 24 μB per formula unit at its equilibrium volume, which makes this compound a promising candidate for applications in spintronics.

Keywords: CeFeO3, magnetic moment, half-metallic, electronic properties

Procedia PDF Downloads 368
5178 Fact-checking and Political Polarization in an Emerging Democracy

Authors: Eric Agyekum, Dominic Asitanga

Abstract:

Ghana is widely considered a beacon of democracy in sub-Saharan Africa. With a relatively free media, the country was ranked 30th in the world and third in Africa on the 2021 Press Freedom Index. Despite the democratic gains, it is one of the most politically polarized nations in the world. Ghana’s political division is evident in the current hung legislature, where each of the two dominant political parties has 137 members, with an independent member occupying the remaining one seat. Misinformation and fake news thrive in systems with acute ideological and political differences (Imelda et al., 2021; Azzimonti & Fernandes, 2018; Spohr, 2017), and Ghana is no exception. The information disorder problem has been exacerbated by the COVID-19 pandemic, with its attendant conspiracy theories and speculations, making it difficult for the media and fact-checking organizations to verify all claims and flag false information. In Ghana, fact-checking agencies like Ghana Fact, Dubawa Ghana, and some mainstream news media organizations have been fact-checking political claims, COVID-19 conspiracy theories, and many others. However, it is not clear if the audience consume and attach prominence to these fact-checked stories or even visit the websites of the fact-checking agencies to read the content. Nekmat (2020) opines that though the literature on fact-checking suggests that fact-checked stories can alter readers’ beliefs, very few studies have investigated the patronage and the potential of fact-checks to deter users from sharing false news with others, particularly on social media. In response to Nekmat, this study has been initiated to examine the perception and attitude of the audience in Ghana towards fact-checks. Anchored on the principles of the nudge theory, this study will investigate how fact-checked stories alter readers’ behavioural patterns. A survey will be conducted to collect data from sampled members of the Ghanaian society.

Keywords: fact-checking, information disorder, nudge theory, political polarization

Procedia PDF Downloads 139
5177 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered as a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for further development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity of rule induction from datasets with contaminated attribute values created by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical size of dataset derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.

Keywords: rule induction, decision table, missing data, noise

Procedia PDF Downloads 395
5176 Features Vector Selection for the Recognition of the Fragmented Handwritten Numeric Chains

Authors: Salim Ouchtati, Aissa Belmeguenai, Mouldi Bedda

Abstract:

In this study, we propose an offline system for the recognition of fragmented handwritten numeric chains. Firstly, we realized a recognition system for isolated handwritten digits; in this part, the study is based mainly on the evaluation of neural network performances, trained with the gradient backpropagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the isolated handwritten digits by several methods: the distribution sequence, sondes application, the Barr features, and the centered moments of the different projections and profiles. Secondly, the study is extended to the reading of fragmented handwritten numeric chains consisting of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system developed in the first part (the recognition system for isolated handwritten digits).
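
The vertical-projection segmentation step mentioned above can be sketched as follows on a toy binary image; the neural network classifier itself is omitted.

```python
# Minimal sketch of vertical-projection segmentation: columns whose ink count
# falls to zero separate consecutive digits in a binary image (1 = ink,
# 0 = background). The tiny array below is illustrative.
import numpy as np

def segment_by_vertical_projection(binary_img):
    profile = binary_img.sum(axis=0)           # ink pixels per column
    segments, start = [], None
    for x, count in enumerate(profile):
        if count > 0 and start is None:
            start = x                          # a digit begins
        elif count == 0 and start is not None:
            segments.append(binary_img[:, start:x])   # digit ends at an empty column
            start = None
    if start is not None:
        segments.append(binary_img[:, start:])
    return segments

img = np.zeros((8, 12), dtype=int)
img[2:6, 1:4] = 1                              # first "digit"
img[1:7, 6:10] = 1                             # second "digit"
print([seg.shape for seg in segment_by_vertical_projection(img)])
```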

Keywords: features extraction, handwritten numeric chains, image processing, neural networks

Procedia PDF Downloads 265
5175 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning

Authors: Tanvi P. Patel, Warish D. Patel

Abstract:

Association rule mining forms the core of data mining and is termed one of the well-known methodologies of data mining. The objective of mining is to find interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative for mining patterns and then generating rules from these obtained patterns. For efficient targeted query processing, finding frequent patterns and itemset mining, an efficient way to generate an itemset tree structure is the Memory Efficient Itemset Tree (MEIT). The memory efficient itemset tree is efficient for storing itemsets, but takes more time compared to the traditional itemset tree. The proposed strategy generates maximal frequent itemsets from the memory efficient itemset tree by using levelwise pruning. For that, pre-pruning of items based on the minimum support count is first carried out, followed by itemset tree reconstruction. By having maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory efficient itemset tree approach proposed here helps to optimize main memory overhead as well as reduce processing time.
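
A minimal sketch of the pre-pruning step, assuming transactions as item lists and a simple minimum-support filter applied before tree reconstruction.

```python
# Sketch of the pre-pruning idea: items below the minimum support count are
# removed from every transaction before the itemset tree is rebuilt, so only
# frequent items can appear in candidate itemsets. Data are toy values.
from collections import Counter

def prune_infrequent_items(transactions, min_support):
    counts = Counter(item for t in transactions for item in set(t))
    keep = {item for item, c in counts.items() if c >= min_support}
    return [[item for item in t if item in keep] for t in transactions]

transactions = [["a", "b", "c"], ["a", "c"], ["a", "d"], ["b", "c"], ["a", "b", "c", "e"]]
print(prune_infrequent_items(transactions, min_support=2))
# items "d" and "e" occur only once and are dropped before tree reconstruction
```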

Keywords: association rule mining, itemset mining, itemset tree, meit, maximal frequent pattern

Procedia PDF Downloads 368
5174 Intentions and Willingness of Marketing Professionals to Adopt Neuromarketing

Authors: Anka Gorgiev, Chris Martin, Nikolaos Dimitriadis, Dimitrios V. Nikolaidis

Abstract:

This paper is part of a doctoral research study aimed at identifying behavioral indicators for the existence of a new marketing paradigm. Neuromarketing is becoming a growing trend in the marketing industry worldwide, and it is capturing a lot of interest among members of academia and the practitioner community. However, it is still not very clear how big an impact neuromarketing might have in the following years. In an effort to get closer to an answer, this study investigates behavioral intentions and willingness to adopt neuromarketing and its practices by marketing professionals, including academics, practitioners, students, researchers, experts and journal editors. The participants in the study include marketing professionals at different levels of neuromarketing fluency with residency in the United States of America and South East Europe. A total of 19 participants took part in the interviews, all of whom belong to more than one group of marketing professionals. The authors use a qualitative research approach and open-ended interview questions specifically developed to assess the ideas, beliefs and opinions that marketing professionals hold towards neuromarketing. In constructing the interview questions, the authors have used the theory of planned behavior, the prototype willingness model and the technology acceptance model as a theoretical framework. Previous studies have not explicitly investigated the behavioral intentions of marketing professionals to engage in neuromarketing behavior, which is described here as a tendency to apply neuromarketing assumptions and tools in usual marketing practices. This study suggests that marketing professionals believe that neuromarketing can contribute to the business in a positive way, and it outlines the main advantages and disadvantages of adopting neuromarketing as identified by the participants. In addition, the study reveals an emerging image of an exemplar company that is perceived to be using neuromarketing, including its most common characteristics and attributes. These findings are believed to be crucial in facilitating a way for the neuromarketing field to have a broader impact than it currently does, by recognizing and understanding the limitations that such exemplars imply and how that affects the decision-making of marketing professionals.

Keywords: behavioral intentions, marketing paradigm, neuromarketing adoption, theory of planned behavior

Procedia PDF Downloads 171
5173 Integrating Machine Learning and Rule-Based Decision Models for Enhanced B2B Sales Forecasting and Customer Prioritization

Authors: Wenqi Liu, Reginald Bailey

Abstract:

This study explores an advanced approach to enhancing B2B sales forecasting by integrating machine learning models with a rule-based decision framework. The methodology begins with the development of a machine learning classification model to predict conversion likelihood, aiming to improve accuracy over traditional methods like logistic regression. The classification model's effectiveness is measured using metrics such as accuracy, precision, recall, and F1 score, alongside a feature importance analysis to identify key predictors. Following this, a machine learning regression model is used to forecast sales value, with the objective of reducing mean absolute error (MAE) compared to linear regression techniques. The regression model's performance is assessed using MAE, root mean square error (RMSE), and R-squared metrics, emphasizing feature contribution to the prediction. To bridge the gap between predictive analytics and decision-making, a rule-based decision model is introduced that prioritizes customers based on predefined thresholds for conversion probability and predicted sales value. This approach significantly enhances customer prioritization and improves overall sales performance by increasing conversion rates and optimizing revenue generation. The findings suggest that this combined framework offers a practical, data-driven solution for sales teams, facilitating more strategic decision-making in B2B environments.
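
A hedged sketch of the combined framework on synthetic data: a classifier for conversion likelihood, a regressor for sales value, and a threshold rule for prioritization; the models, features and thresholds are assumptions, not the paper's calibrated choices.

```python
# Hedged sketch of the combined framework: a classifier estimates conversion
# probability, a regressor estimates deal value, and a simple rule prioritizes
# customers whose probability and value both clear thresholds. Data and the two
# thresholds are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 5))                           # opportunity features
converted = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 500) > 0).astype(int)
value = np.exp(2 + 0.3 * X[:, 2] + rng.normal(0, 0.2, 500)) * converted

clf = GradientBoostingClassifier().fit(X, converted)
reg = GradientBoostingRegressor().fit(X[converted == 1], value[converted == 1])

def prioritize(X_new, p_min=0.6, v_min=10.0):
    p = clf.predict_proba(X_new)[:, 1]                  # conversion likelihood
    v = reg.predict(X_new)                              # expected sales value
    return np.where((p >= p_min) & (v >= v_min))[0]     # indices of priority accounts

print(prioritize(rng.normal(size=(20, 5))))
```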

Keywords: sales forecasting, machine learning, rule-based decision model, customer prioritization, predictive analytics

Procedia PDF Downloads 14
5172 Implementation of Statistical Parameters to Form an Entropic Mathematical Model

Authors: Gurcharan Singh Buttar

Abstract:

It has been discovered that although the two areas of statistics and information theory are independent in nature, they can be combined to create applications in multidisciplinary mathematics. This is because, in the field of statistics, statistical parameters (measures) play an essential role in reference to the population (distribution) under investigation, while an information measure is crucial in the study of the ambiguity, assortment, and unpredictability present in an array of phenomena. The following communication is a link between the two, and it demonstrates that the well-known conventional statistical measures can be used as measures of information.
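
A small numeric illustration of the claimed link between statistical and information measures, using variance and Shannon entropy on two toy distributions.

```python
# For a discrete distribution, both the variance (a statistical measure) and the
# Shannon entropy (an information measure) respond to how spread out the
# probability mass is; the two toy distributions below are illustrative.
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

values = np.arange(5)
concentrated = np.array([0.02, 0.06, 0.84, 0.06, 0.02])
uniform = np.full(5, 0.2)

for name, p in [("concentrated", concentrated), ("uniform", uniform)]:
    mean = np.sum(values * p)
    var = np.sum(p * (values - mean) ** 2)
    print(name, "variance:", round(var, 3), "entropy (bits):", round(shannon_entropy(p), 3))
```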

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 155
5171 Digital Architectural Practice as a Challenge for Digital Architectural Technology Elements in the Era of Digital Design

Authors: Ling Liyun

Abstract:

In the field of contemporary architecture, complex forms of architectural works continue to emerge around the world, along with new terminology: digital architecture, parametric design, algorithmic generation, building information modeling, CNC construction and so on. Architects have gradually mastered new skills of mathematical logic in the form of exploration, virtual simulation, and the design and coordination of the entire construction process. Digital construction technology offers a greater degree of control over construction and ensures its accuracy, creating a series of new construction techniques. As a result, the use of digital technology represents an improvement and expansion of the practice brought about by the digital architectural design revolution. We worked by reading and analyzing information about the development process of digital architecture, a large number of cases, as well as architectural design and construction as a whole process. Current developments are thus introduced and discussed in our paper, such as architectural discourse, design theory, digital design models and techniques, material selection, as well as artificial intelligence space design. Our paper also discusses three representative cases of digital design and construction experiments at length and in detail, to expound high informatization, high-reliability intelligence, and high technique in constructing a humane space to cope with the rapid development of urbanization. We concluded that opportunities and challenges exist in the shift of architectural paradigms, such as the cooperation methods, theories, models, technologies and techniques currently employed in digital design research and digital praxis. We also find that the innovative use of space can gradually change the way people learn, talk, and control information. Over the past two decades, digital technology has radically broken the technological constraints of industrial products and loosened the hold of any particular architectural style (era doctrine). People should not have to adapt to the machine; rather, the machine should be made to work for its users.

Keywords: artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction

Procedia PDF Downloads 135
5170 Smooth Second Order Nonsingular Terminal Sliding Mode Control for a 6 DOF Quadrotor UAV

Authors: V. Tabrizi, A. Vali, R. GHasemi, V. Behnamgol

Abstract:

In this article, a nonlinear model of an underactuated six degrees of freedom (6 DOF) quadrotor UAV is derived on the basis of the Newton-Euler formulation. The derivation comprises determining the equations of motion of the quadrotor in three dimensions and approximating the actuation forces through the modeling of aerodynamic coefficients and electric motor dynamics. The robust nonlinear control strategy includes a smooth second-order nonsingular terminal sliding mode control, which is applied to stabilize this model. The control method is based on the super-twisting algorithm for removing chattering and producing a smooth control signal. Also, the nonsingular terminal sliding mode idea is used to introduce a nonlinear sliding variable that guarantees finite-time convergence in the sliding phase. Simulation results show that the proposed algorithm is robust against uncertainty or disturbance and guarantees a fast and precise control signal.
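
The super-twisting idea can be illustrated on a scalar perturbed system, as in the hedged sketch below; the full quadrotor dynamics and the terminal sliding surface are omitted, and the gains and disturbance are illustrative.

```python
# Minimal super-twisting sketch on a scalar perturbed system s' = u + d(t),
# showing the continuous (chattering-reduced) control signal the abstract relies on.
# The 6-DOF quadrotor model and the terminal sliding surface are omitted; gains
# k1, k2 and the disturbance below are illustrative.
import numpy as np

def super_twisting(T=5.0, dt=1e-3, k1=2.0, k2=1.1):
    steps = int(T / dt)
    s, v = 1.0, 0.0                                   # sliding variable, integral term
    hist = np.zeros((steps, 2))                       # log s and u
    for i in range(steps):
        d = 0.3 * np.sin(2 * np.pi * 0.2 * i * dt)    # bounded matched disturbance
        u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v    # continuous control component
        v += -k2 * np.sign(s) * dt                    # integral of discontinuous part
        s += (u + d) * dt                             # explicit Euler step
        hist[i] = (s, u)
    return hist

hist = super_twisting()
print("final |s|:", abs(hist[-1, 0]))                 # sliding variable driven near zero
```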

Keywords: quadrotor UAV, nonsingular terminal sliding mode, second order sliding mode, electronics, control, signal processing

Procedia PDF Downloads 439
5169 Limits Problem Solving in Engineering Careers: Competences and Errors

Authors: Veronica Diaz Quezada

Abstract:

In this article, performance and errors in solving limit problems for a real-valued function are characterized and analysed, in correspondence with competency-based education in engineering careers in the south of Chile. The methodological component is contextualised in qualitative research with a descriptive and exploratory design, including the elaboration, content validation and application of quantitative instruments consisting of two parallel forms of open-answer tests based on limit application problems. The mathematical competences and errors made by students from five engineering careers at a public university are identified and characterized. Results show better performance only for the routine-context problem-solving competence: students are oriented towards a rational solution or use a suitable problem-solving method, achieving the correct solution. Regarding errors, most of them are related to techniques and the incorrect use of theorems and definitions of limits of real-valued functions of a real variable.

Keywords: engineering education, errors, limits, mathematics competences, problem solving

Procedia PDF Downloads 150
5168 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table

Authors: David A. Swanson, Lucky M. Tedrow

Abstract:

Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to this research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to the mean age at death in a life table) and the variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of the variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of the variance in age at death for six countries, three of which have high e0 values and three of which have lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of the variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
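
A sketch of the Taylor's-Law fitting step on made-up (e0, variance) pairs; the actual World Bank and Human Mortality Database values are not reproduced here.

```python
# Sketch of the Taylor's-Law step: fit log(variance in age at death) against
# log(e0) across life tables and then predict variance from e0 alone. The (e0,
# variance) pairs below are made-up stand-ins for the regional life tables.
import numpy as np

e0 = np.array([52.0, 58.5, 63.0, 68.2, 72.4, 76.8, 81.1])          # illustrative
var_age_death = np.array([560.0, 470.0, 400.0, 330.0, 270.0, 210.0, 160.0])

# Taylor's Law: variance ~ a * mean**b, i.e. log V = log a + b * log e0
b, log_a = np.polyfit(np.log(e0), np.log(var_age_death), 1)
predict_var = lambda m: np.exp(log_a) * m ** b

print("exponent b:", round(b, 2))
print("predicted variance at e0 = 75:", round(float(predict_var(75.0)), 1))
```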

Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population

Procedia PDF Downloads 328
5167 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (the fast Fourier transform (FFT), the inverse fast Fourier transform (iFFT) and Bessel functions) is widely applied to obtain information with high precision and accuracy. Signal processing has a wider implementation on general-purpose processors. Our interest was focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of a single-processor architecture: the processing is accelerated on the FPGA to meet real-time and energy-efficiency requirements. General-purpose processors are not efficient for signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than in the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms is realized with an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing. It is possible to obtain information related to the detection and characterization of submerged cells. Thus, we have achieved good experimental results in terms of real-time operation and energy efficiency.
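
The spectral computation moved onto the FPGA can be illustrated in software with a synthetic backscattered pulse, as below; the sampling rate, carrier frequency and echo model are assumptions.

```python
# Software sketch of the spectral step implemented in hardware: the magnitude
# spectrum of a synthetic backscattered pulse via FFT. The echo model, sampling
# rate and carrier frequency here are illustrative assumptions.
import numpy as np

fs = 1_000_000                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)

# Synthetic backscattered signal: two delayed, decaying echoes of a 150 kHz carrier
echo = lambda t0: np.exp(-4e3 * np.clip(t - t0, 0, None)) * np.sin(2 * np.pi * 150e3 * t) * (t >= t0)
signal = echo(0.2e-3) + 0.4 * echo(0.9e-3) + 0.01 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.fft.rfft(signal * np.hanning(t.size))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(np.abs(spectrum))]
print("dominant frequency (kHz):", peak / 1e3)   # should sit near the 150 kHz carrier
```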

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 75
5166 Application and Verification of Regression Model to Landslide Susceptibility Mapping

Authors: Masood Beheshtirad

Abstract:

Identification of regions with potential for landslide occurrence is one of the basic measures in natural resources management. Different landslide hazard mapping models have been proposed depending on the environmental conditions and goals. In this research, a landslide hazard map was produced using a multiple regression model, and the applicability of this model was investigated in the Baghdasht watershed. The dependent variable is the landslide inventory map, and the independent variables consist of information layers such as geology, slope, aspect, distance from river, distance from road, fault and land use. To do this, existing landslides were identified and an inventory map was made. The landslide hazard map was then produced based on the multiple regression model. The similarity between the potential hazard classes produced by this model and the landslide inventory map was assessed in the SPSS environment. The results showed a significant correlation between the potential hazard classes and the area of the landslides. The multiple regression model is therefore suitable for application in the Baghdasht watershed.
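
A small sketch of the regression-based mapping step on synthetic raster layers; the real predictor layers and inventory for the Baghdasht watershed are not used here.

```python
# Sketch of the mapping step: a multiple regression fitted on sampled cells
# (predictor layers -> landslide signal) and then applied to every cell to
# produce a susceptibility surface. Layers here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
rows, cols = 100, 100
slope = rng.uniform(0, 45, (rows, cols))            # degrees
dist_river = rng.uniform(0, 2000, (rows, cols))     # metres
land_use = rng.integers(0, 3, (rows, cols))         # coded classes

X = np.column_stack([slope.ravel(), dist_river.ravel(), land_use.ravel()])
# Synthetic "inventory" signal: steeper, river-adjacent cells more landslide-prone
y = (0.02 * slope - 0.0005 * dist_river + rng.normal(0, 0.3, (rows, cols))).ravel()

model = LinearRegression().fit(X, y)
susceptibility = model.predict(X).reshape(rows, cols)   # hazard map surface
print(susceptibility.min(), susceptibility.max())
```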

Keywords: landslide, mapping, multiple model, regression

Procedia PDF Downloads 322