Search results for: theory of critical distances
3387 Machine Learning for Rational Decision-Making: Introducing Creativity to Teachers within a School System
Authors: Larry Audet
Abstract:
Creativity is suddenly and fortunately a new educational focus in the United Arab Emirates and around the world. Yet still today many leaders of creativity are not sure how to introduce it to their teachers. It is impossible to simultaneously introduce every aspect of creativity into a work climate and reach any degree of organizational coherence. The number of alternatives to explore is so great, and the information teachers need to learn is so vast, that even an approximation to including every concept and theory of creativity into the school organization is hard to conceive. Effective leaders of creativity need evidence-based and practical guidance for introducing and stimulating creativity in others. Machine learning models reveal new findings from KEYS Survey© data about teacher perceptions of stimulants and barriers to their individual and collective creativity. Findings from predictive and causal models provide leaders with a rationale for decision-making when introducing creativity into their organization. Leaders should focus on management practices first. Analyses reveal that creative outcomes are more likely to occur when teachers perceive supportive management practices: providing teachers with challenging work that calls for their best efforts; allowing freedom and autonomy in their practice of work; allowing teachers to form creative work-groups; and recognizing them for their efforts. Once management practices are in place, leaders should focus their efforts on modeling risk-taking, providing optimal amounts of preparation time, and evaluating teachers fairly.
Keywords: creativity, leadership, KEYS survey, teaching, work climate
Procedia PDF Downloads 166
3386 Mass Media and Electoral Conflict Management in Kogi State, Nigeria
Authors: Okpanachi Linus Odiji, Chris Ogwu Attah
Abstract:
Election is no doubt widely assumed to be one of the most suitable means of resolving political quagmires, even though it has never been bereft of conflict, which can manifest before, during, or after polls. What, however, advances democracy and promotes electoral integrity is the existence and effectiveness of institutional frameworks for electoral conflict management. Electoral conflicts are no doubt unique in the sense that they represent the struggles of people over the control of public resources. In most cases, the stakes involved are so high and emotional that they not only undermine inter-group relationships but also threaten national security. The need, therefore, for an effectively functional conflict management apparatus becomes imperative. While at the state level there exist numerous governmental initiatives at various electoral stages aimed at managing conflicts, this paper examines the activities of the mass media, which is another prominent stakeholder in the electoral process. Even though media influence has increased tremendously in the last decade, researchers are yet to agree on its utility in the management of conflicts. Guided by the social responsibility theory of media reporting and drawing data from observed trends in Kogi State, the paper, which content-analyses the 2019 gubernatorial election coverage in the state, observes both conflict escalation and de-escalation roles in the media. To mitigate misrepresentation in conflict reporting, therefore, a common approach to conflict reporting should be designed and mandated by the National Broadcasting Commission as well as the Nigerian Press Council. This should be complemented by the training of journalists on conflict reporting and the development of a standard conflict reporting procedure.
Keywords: conflict management, electoral conflict, mass media, media reporting
Procedia PDF Downloads 150
3385 Processing Studies and Challenges Faced in Development of High-Pressure Titanium Alloy Cryogenic Gas Bottles
Authors: Bhanu Pant, Sanjay H. Upadhyay
Abstract:
Frequently, the upper stage of high-performance launch vehicles utilizes cryogenic tank-submerged pressurization gas bottles with high volume-to-weight efficiency to achieve a direct gain in the satellite payload. Titanium alloys, owing to their high specific strength coupled with excellent compatibility with various fluids, are the materials of choice for these applications. Amongst the titanium alloys, there are two alloys suitable for cryogenic applications, namely Ti6Al4V-ELI and Ti5Al2.5Sn-ELI. The two-phase alpha-beta alloy Ti6Al4V-ELI is usable up to the LOX temperature of 90 K, while the single-phase alpha alloy Ti5Al2.5Sn-ELI can be used down to the LHe temperature of 4 K. High-pressure gas bottles submerged in LH2 (20 K) can store a greater amount of gas than bottles of the same volume submerged in LOX (90 K). Thus, the use of these alpha alloy gas bottles at 20 K gives a distinct advantage: fewer gas bottles are needed to store the same amount of high-pressure gas, which in turn leads to a one-to-one gain in satellite payload. A cost advantage to the tune of 15,000 $/kg of weight saved in the upper stage, and thereby a satellite payload gain, is expected from this change. However, the processing of alpha Ti5Al2.5Sn-ELI alloy gas bottles poses challenges due to the lower forgeability of the alloy and the mode of qualification for the critical, severe application environment. The present paper describes the processing and challenges/solutions during the development of these advanced gas bottles for LH2 (20 K) applications.
Keywords: titanium alloys, cryogenic gas bottles, alpha titanium alloy, alpha-beta titanium alloy
Procedia PDF Downloads 58
3384 The Impacts of Cultural Differences on Consumer Behavior when Multinational Corporations Enter the Chinese Market
Authors: Xue Junwei
Abstract:
In the global economy, multinational corporations face challenges due to cultural differences impacting consumer behavior. Understanding these influences is vital for effective business decisions in the Chinese market. This study aims to analyze how cultural differences affect consumer behavior when multinational corporations enter the Chinese market, using cultural dimensions theory to derive marketing mix strategies. The study employs statistical analysis of cultural dimensions to investigate their impact on consumer behavior and derive marketing strategies for multinational corporations entering the Chinese market. Furthermore, the study incorporates qualitative data to complement the statistical analysis, providing a more comprehensive understanding of cultural impacts on consumer behavior. The study reveals significant implications of cultural differences on consumer behavior and provides insights into tailored marketing mix strategies for multinational corporations in the Chinese market. This research contributes to the theoretical understanding of how cultural dimensions influence consumer behavior and provides practical implications for multinational corporations entering the Chinese market. Data on cultural dimensions are collected and analyzed statistically and qualitatively to understand their impact on consumer behavior and derive effective marketing strategies. This study concludes that cultural differences have a profound impact on consumer behavior in the Chinese market, and understanding these nuances is crucial for the success of multinational corporations. Tailored marketing strategies are essential for navigating these cultural challenges.
Keywords: marketing, multinational company, globalization, cultural differences
Procedia PDF Downloads 9
3383 Dissection of the Impact of Diabetes Type on Heart Failure across Age Groups: A Systematic Review of Publication Patterns on PubMed
Authors: Nazanin Ahmadi Daryakenari
Abstract:
Background: Diabetes significantly influences the risk of heart failure. The interplay between distinct types of diabetes, heart failure, and their distribution across various age groups remains an area of active exploration. This study endeavors to scrutinize the age group distribution in publications addressing Type 1 and Type 2 diabetes and heart failure on PubMed while also examining the evolving publication trends. Methods: We leveraged E-utilities and RegEx to search and extract publication data from PubMed using various MeSH terms. Subsequently, we conducted descriptive statistics and t-tests to discern the differences between the two diabetes types and the distribution across age groups. Finally, we analyzed the temporal trends of publications concerning both types of diabetes and heart failure. Results: Our findings revealed a divergence in the age group distribution between Type 1 and Type 2 diabetes within heart failure publications. Publications discussing Type 2 diabetes and heart failure were more prevalent among older age groups, whereas those addressing Type 1 diabetes and heart failure displayed a more balanced distribution across all age groups. The t-test revealed no significant difference in the means between the two diabetes types. However, the number of publications exploring the relationship between Type 2 diabetes and heart failure has seen a steady increase over time, suggesting an escalating interest in this area. Conclusion: The dissection of publication patterns on PubMed uncovers a pronounced association between Type 2 diabetes and heart failure within older age groups. This highlights the critical need to comprehend the distinct age group differences when examining diabetes and heart failure to inform and refine targeted prevention and treatment strategies.
Keywords: Type 1 diabetes, Type 2 diabetes, heart failure, age groups, publication patterns, PubMed
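As a rough illustration of the retrieval step described above, the following is a minimal sketch that queries the PubMed E-utilities esearch endpoint to count records per diabetes type and age-group filter; the query strings and age filters are illustrative assumptions, not the study's exact search terms.

```python
# Illustrative sketch: counting PubMed records by MeSH term and age-group filter
# via the NCBI E-utilities esearch endpoint. The query strings below are
# hypothetical examples, not the exact terms used in the study.
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(query: str) -> int:
    """Return the number of PubMed records matching a query."""
    params = {"db": "pubmed", "term": query, "retmode": "json", "retmax": 0}
    resp = requests.get(ESEARCH, params=params, timeout=30)
    resp.raise_for_status()
    return int(resp.json()["esearchresult"]["count"])

age_filters = {
    "middle aged": "middle aged[MeSH Terms]",
    "aged": "aged[MeSH Terms]",
    "aged, 80 and over": "aged, 80 and over[MeSH Terms]",
}

for label, age in age_filters.items():
    t1 = pubmed_count(f'"diabetes mellitus, type 1"[MeSH Terms] AND "heart failure"[MeSH Terms] AND {age}')
    t2 = pubmed_count(f'"diabetes mellitus, type 2"[MeSH Terms] AND "heart failure"[MeSH Terms] AND {age}')
    print(f"{label}: type 1 = {t1}, type 2 = {t2}")
```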
Procedia PDF Downloads 95
3382 Sustainable Management Practices of International Construction Joint Ventures: A Conceptual Model for Managing Barriers and Risks
Authors: Mershack O. Tetteh, Albert P. C. Chan, Amos Darko, Gabriel Nani
Abstract:
International construction joint ventures (ICJVs) have evolved as an effective approach to sustainable development, given their myriad socio-economic and environmental benefits. Yet, they are not free of barriers and risks. In many studies, both are simply termed 'risks' for convenience's sake. While the barriers and risks continue to affect the success of ICJVs, a systematic and reliable approach for managing them has yet to be developed. This study aims to identify and classify the barrier and risk factors affecting ICJVs through a systematic literature review. Based on a critical review of 54 papers published in peer-reviewed journals from 1990 to 2019, a conceptual framework was proposed for managing the barriers and risks in ICJV operations. The review showed that the barriers can be grouped into six categories: inter-organizational differences, lack of expertise and confidence, lack of effective planning and strategies, lack of knowledge of ICJVs' fundamentals, conflicts among ICJV entities, and management difficulties. The risks were also categorized into six: policy and political risks, legal risks, financial risks, management risks, project and technical risks, and market risks. The developed model would help practitioners achieve more efficient resource allocation and bring new perspectives to managerial practices in ICJVs. Moreover, it addresses the shortcoming of previous studies that combined the barrier and risk factors into one checklist.
Keywords: barriers, construction, international construction joint venture, risks, sustainable development
Procedia PDF Downloads 260
3381 Factors Influencing Soil Organic Carbon Storage Estimation in Agricultural Soils: A Machine Learning Approach Using Remote Sensing Data Integration
Authors: O. Sunantha, S. Zhenfeng, S. Phattraporn, A. Zeeshan
Abstract:
The decline of soil organic carbon (SOC) in global agriculture is a critical issue requiring rapid and accurate estimation for informed policymaking. While it is recognized that SOC predictors vary significantly when derived from remote sensing data and environmental variables, identifying the specific parameters most suitable for accurately estimating SOC in diverse agricultural areas remains a challenge. This study utilizes remote sensing data to precisely estimate SOC and identify influential factors in diverse agricultural areas, such as paddy, corn, sugarcane, cassava, and perennial crops. Extreme gradient boosting (XGBoost), random forest (RF), and support vector regression (SVR) models are employed to analyze these factors' impact on SOC estimation. The results show that key factors influencing SOC estimation include slope, vegetation indices (EVI), spectral reflectance indices (red index, red edge2), temperature, land use, and surface soil moisture, as indicated by their averaged importance scores across the XGBoost, RF, and SVR models. Therefore, using different machine learning algorithms for SOC estimation reveals varying influential factors from remote sensing data and environmental variables. This approach emphasizes feature selection, as different machine learning algorithms identify various key factors from remote sensing data and environmental variables for accurate SOC estimation.
Keywords: factors influencing SOC estimation, remote sensing data, environmental variables, machine learning
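A minimal sketch of the importance-averaging idea, assuming scikit-learn/XGBoost tooling and permutation importance as a common metric across the three models; the file and feature names are illustrative placeholders, and the study's exact pipeline may differ.

```python
# Sketch: averaging permutation importances of candidate SOC predictors across
# XGBoost, random forest, and SVR regressors. Column names are illustrative and
# assumed to be numeric-coded; this is not the study's actual data pipeline.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

features = ["slope", "EVI", "red_index", "red_edge2", "temperature",
            "land_use", "surface_soil_moisture"]
df = pd.read_csv("soc_samples.csv")                    # hypothetical training table
X, y = df[features], df["SOC"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {"XGBoost": XGBRegressor(n_estimators=300),
          "RF": RandomForestRegressor(n_estimators=300),
          "SVR": SVR(kernel="rbf", C=10.0)}

scores = []
for name, model in models.items():
    model.fit(X_tr, y_tr)
    imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
    scores.append(imp.importances_mean)

avg_importance = pd.Series(np.mean(scores, axis=0), index=features)
print(avg_importance.sort_values(ascending=False))
```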
Procedia PDF Downloads 35
3380 Innovative Approaches to Water Resources Management: Addressing Challenges through Machine Learning and Remote Sensing
Authors: Abdelrahman Elsehsah, Abdelazim Negm, Eid Ashour, Mohamed Elsahabi
Abstract:
Water resources management is a critical field that encompasses the planning, development, conservation, and allocation of water resources to meet societal needs while ensuring environmental sustainability. This paper reviews the key concepts and challenges in water resources management, emphasizing the significance of a holistic approach that integrates social, economic, and environmental factors. Traditional water management practices, characterized by supply-oriented strategies and centralized control, are increasingly inadequate in addressing contemporary challenges such as water scarcity, climate change impacts, and ecosystem degradation. Emerging technologies, particularly machine learning and remote sensing, offer innovative solutions to enhance decision-making processes in water management. Machine learning algorithms facilitate accurate water demand forecasting, quality monitoring, and leak detection, while remote sensing technologies provide vital data for assessing water availability and quality. This review highlights the need for integrated water management strategies that leverage these technologies to promote sustainable practices and foster resilience in water systems. Future research should focus on improving data quality, accessibility, and the integration of diverse datasets to optimize the benefits of these technological advancements.
Keywords: water resources management, water scarcity, climate change, machine learning, remote sensing, water quality, water governance, sustainable practices, ecosystem management
Procedia PDF Downloads 9
3379 Dispersions of Carbon Black in Microemulsions
Authors: Mohamed Youssry, Dominique Guyomard, Bernard Lestriez
Abstract:
In order to enhance the energy and power densities of electrodes for energy storage systems, the formulation and processing of electrode slurries proved to be a critical issue in determining the electrode performance. In this study, we introduce a novel approach to formulating carbon black slurries based on microemulsion and lyotropic liquid crystalline phases (namely, a lamellar phase) composed of a non-ionic surfactant (Triton X100), decanol, and water. Simultaneous measurements of the electrical properties of slurries under shear flow (rheology) have been conducted to elucidate the microstructure evolution with surfactant concentration and decanol/water ratio at rest, as well as the structural transition under steady shear, which has been confirmed by rheo-microscopy. Interestingly, the carbon black slurries at a low decanol/water ratio are weak gels (flowable) with higher electrical conductivity than those at a higher ratio, which exhibit a strong-gel viscoelastic response. In addition, the slurries show recoverable electrical behaviour under shear flow in tandem with the viscosity trend. It is likely that the oil-in-water microemulsion enhances the slurries' stability without affecting the percolating network of carbon black. On the other hand, the oil-in-water analogue and the bilayer structure of the lamellar phase make the slurries less conductive as a consequence of losing network percolation. These findings are encouraging for formulating microemulsion-based electrodes for energy storage systems (lithium-ion batteries).
Keywords: electrode slurries, microemulsion, microstructure transition, rheo-electrical properties
Procedia PDF Downloads 266
3378 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts
Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti
Abstract:
Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-code file. An Adaptive Mesh Refinement (AMR) based on the octree strategy is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization
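For reference, a minimal sketch of a plane-stress Tsai-Wu failure index of the kind such an optimization loop could evaluate; the strength values and the F12 interaction estimate are illustrative assumptions, not the study's calibrated material data.

```python
# Minimal sketch of a plane-stress Tsai-Wu failure index for an orthotropic
# FFF material. Strength values are illustrative placeholders, not measured data.
def tsai_wu_index(s1, s2, s12, Xt, Xc, Yt, Yc, S, F12_star=-0.5):
    """Tsai-Wu index: failure is predicted when the index reaches 1."""
    F1 = 1.0 / Xt - 1.0 / Xc
    F2 = 1.0 / Yt - 1.0 / Yc
    F11 = 1.0 / (Xt * Xc)
    F22 = 1.0 / (Yt * Yc)
    F66 = 1.0 / S**2
    F12 = F12_star * (F11 * F22) ** 0.5          # common interaction-term estimate
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * s12**2 + 2.0 * F12 * s1 * s2)

# Example with hypothetical strengths (MPa) along/across the printed filament:
print(tsai_wu_index(s1=30.0, s2=10.0, s12=5.0,
                    Xt=45.0, Xc=40.0, Yt=20.0, Yc=30.0, S=15.0))
```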
Procedia PDF Downloads 63
3377 Optimization of Fused Deposition Modeling 3D Printing Process via Preprocess Calibration Routine Using Low-Cost Thermal Sensing
Authors: Raz Flieshman, Adam Michael Altenbuchner, Jörg Krüger
Abstract:
This paper presents an approach to optimizing the Fused Deposition Modeling (FDM) 3D printing process through a preprocess calibration routine of printing parameters. The core of this method involves the use of a low-cost thermal sensor capable of measuring temperatures within the range of -20 to 500 degrees Celsius for detailed process observation. The calibration process is conducted by printing a predetermined path while varying the process parameters through machine instructions (g-code). This enables the extraction of critical thermal, dimensional, and surface properties along the printed path. The calibration routine utilizes computer vision models to extract features and metrics from the thermal images, including temperature distribution, layer adhesion quality, surface roughness, and dimensional accuracy and consistency. These extracted properties are then analyzed to optimize the process parameters to achieve the desired qualities of the printed material. A significant benefit of this calibration method is its potential to create printing parameter profiles for new polymer and composite materials, thereby enhancing the versatility and application range of FDM 3D printing. The proposed method demonstrates significant potential in enhancing the precision and reliability of FDM 3D printing, making it a valuable contribution to the field of additive manufacturing.
Keywords: FDM 3D printing, preprocess calibration, thermal sensor, process optimization, additive manufacturing, computer vision, material profiles
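A toy illustration of the metric-extraction step, assuming a temperature-calibrated frame and known toolpath pixel coordinates (both synthetic here); the computer vision models described above are considerably more involved.

```python
# Illustrative sketch: pulling simple thermal metrics from pixels sampled along
# the printed calibration path. The thermal frame and path coordinates are
# synthetic stand-ins for the actual sensor output and g-code-derived path.
import numpy as np

frame = 25.0 + 200.0 * np.random.rand(120, 160)       # fake thermal frame, degrees C
path_rows = np.linspace(20, 100, 200).astype(int)      # hypothetical toolpath pixels
path_cols = np.linspace(10, 150, 200).astype(int)

temps = frame[path_rows, path_cols]
metrics = {
    "mean_temp_C": float(temps.mean()),
    "max_temp_C": float(temps.max()),
    "temp_std_C": float(temps.std()),                  # proxy for deposition consistency
}
print(metrics)
```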
Procedia PDF Downloads 41
3376 Probabilistic Models to Evaluate Seismic Liquefaction In Gravelly Soil Using Dynamic Penetration Test and Shear Wave Velocity
Authors: Nima Pirhadi, Shao Yong Bo, Xusheng Wan, Jianguo Lu, Jilei Hu
Abstract:
Although gravels and gravelly soils are assumed to be non-liquefiable because of their high conductivity and small modulus, the occurrence of this phenomenon in some historical earthquakes, especially recent events such as the 2008 Wenchuan (Mw = 7.9), 2014 Cephalonia, Greece (Mw = 6.1), and 2016 Kaikoura, New Zealand (Mw = 7.8) earthquakes, has prompted essential consideration of risk assessment and hazard analysis of seismic gravelly soil liquefaction. Due to the limitations in sampling and laboratory testing of this type of soil, in situ tests and site exploration of case histories are the most accepted procedures. Of all in situ tests, the dynamic penetration test (DPT), which is well known as the Chinese dynamic penetration test, and the shear wave velocity (Vs) test have demonstrated high performance in evaluating seismic gravelly soil liquefaction. However, the lack of a sufficient number of case histories imposes an essential limitation on developing new models. This study first investigates recent earthquakes that caused liquefaction in gravelly soils to collect new data. Then, it adds these data to the available literature's dataset to extend it and finally develops new models to assess seismic gravelly soil liquefaction. To validate the presented models, their results are compared to those of other available models. The results show the reasonable performance of the proposed models and the critical effect of gravel content (GC)% on the assessment.
Keywords: liquefaction, gravel, dynamic penetration test, shear wave velocity
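One hedged way to sketch such a probabilistic model is a logistic regression on DPT blow count, shear wave velocity, and gravel content; the variables, file, and model form below are illustrative and not necessarily those adopted in the paper.

```python
# Hedged sketch of a probabilistic liquefaction model: logistic regression on
# DPT blow count, shear wave velocity, gravel content, and cyclic stress ratio.
# Field names and the CSV file are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

cases = pd.read_csv("gravelly_liquefaction_cases.csv")   # hypothetical case-history table
X = cases[["dpt_blow_count", "vs_m_per_s", "gravel_content_pct", "csr"]]
y = cases["liquefied"]                                    # 1 = liquefaction observed

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Probability of liquefaction for a new site (values are made up):
new_site = pd.DataFrame([{"dpt_blow_count": 8, "vs_m_per_s": 220,
                          "gravel_content_pct": 45, "csr": 0.25}])
print(model.predict_proba(new_site)[0, 1])
```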
Procedia PDF Downloads 201
3375 Ranking of the Main Criteria for Contractor Selection Procedures on Major Construction Projects in Libya Using the Delphi Method
Authors: Othoman Elsayah, Naren Gupta, Binsheng Zhang
Abstract:
The construction sector constitutes one of the most important sectors in the economy of any country. Contractor selection is a critical decision that is undertaken by client organizations and is central to the success of any construction project. Contractor selection (CS) is a process which involves investigating, screening, and determining whether candidate contractors have the technical and financial capability to be accepted to formally tender for construction work. The process should be conducted prior to the award of the contract and is characterized by many factors, such as the contractor's skills, experience on similar projects, track record in the industry, and financial stability. This paper evaluates the current state of knowledge in relation to the contractor selection process and presents the findings from the analysis of the data collected from the Delphi questionnaire survey. The survey was conducted with a group of 12 experts working in the Libyan construction industry (LCI). The paper starts by briefly explaining the general outline of the questionnaire, including the survey participation rate, the different fields the experts came from, and the business titles of the participants. Then, the paper describes the tests used to determine when the experts had reached consensus. The paper is based on research which aims to develop ranked contractor selection criteria with specific application to major construction projects in the Libyan context. The findings of this study will be utilized to establish the scope of work that will be used as part of a PhD research.
Keywords: contractor selection, Libyan construction industry, decision experts, Delphi technique
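A minimal sketch of one common Delphi consensus test, Kendall's coefficient of concordance (W) over the experts' rankings; the ranking matrix below is a made-up illustration, and the paper's actual consensus tests may differ.

```python
# Sketch of a Delphi consensus check: Kendall's coefficient of concordance (W)
# across experts' rankings of contractor-selection criteria (no tied ranks).
import numpy as np

# rows = experts, columns = criteria, entries = rank assigned (1 = most important)
ranks = np.array([
    [1, 2, 3, 4, 5],
    [2, 1, 3, 5, 4],
    [1, 3, 2, 4, 5],
])

m, n = ranks.shape                        # m experts, n criteria
rank_sums = ranks.sum(axis=0)
S = ((rank_sums - rank_sums.mean()) ** 2).sum()
W = 12.0 * S / (m ** 2 * (n ** 3 - n))    # 0 = no agreement, 1 = full consensus
print(round(W, 3))
```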
Procedia PDF Downloads 332
3374 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence
Authors: Mofizul Islam Awwal
Abstract:
Due to globalization, industries are rapidly growing throughout the world, which has led to a large number of manufacturing organizations. Recently, service industries have also begun to emerge in large numbers in almost all parts of the world, including some developing countries. In this context, organizations need to have a strong competitive advantage over their rivals to achieve their strategic business goals. Manufacturing industries are adopting many methods and techniques in order to achieve such a competitive edge. Over the last decades, manufacturing industries have been successfully practicing the lean concept to optimize their production lines. Due to its huge success in the manufacturing context, lean has made its way into the service industry. Very little attention has been paid to services in the area of operations management. Service industries are far behind manufacturing industries in terms of operations improvement. Transferring the lean concept from the production floor to the service back/front office is a demanding task, but one that can obviously yield improvement. Service processes are not as visible as production processes and can be very complex. The lack of research in this area has made it quite difficult for service industries, as there are no standardized frameworks for successfully implementing the lean concept in service organizations. The purpose of this research paper is to capture the present scenario of the service industry in terms of lean implementation. A thorough analysis of past literature will be done on the applicability and understanding of lean in the service structure. Research papers will be classified, and critical factors will be unveiled for implementing lean in the service industry to achieve operational excellence.
Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence
Procedia PDF Downloads 375
3373 Partnership Oriented Innovation Alliance Strategy Based on Market Feedback
Authors: Victor Romanov, Daria Efimenko
Abstract:
The focus on innovation in the modern economy is the main factor in business survival in a competitive environment. Innovations are based on the search for and use of knowledge in a global context. Nowadays, consumers and market demand are the main innovation drivers. This leads to building a business as a system with feedback, promptly restructuring production and implementing innovation in response to market demands. In the modern knowledge economy, because of the speed of technical progress, product lifecycles have become much shorter, which imposes more stringent requirements for innovation implementation on enterprises and therefore reduces the possibility for an enterprise to receive extra income. This circumstance imposes additional requirements for the replacement of obsolete products and the prompt release of innovative products to the market. The development of information technologies has led to the fact that only in conditions of partnership and knowledge sharing with partners is it possible to update products quickly. Many companies pay attention to renewing innovation through the search for new partners, but the task of finding new partners presents some difficulties. The search for a suitable partner includes several stages, such as determining the innovation-critical moment, initiating a search, identifying search criteria, and justifying and deciding on the choice of a partner. No less important is the question of how to manage an innovative product in response to a changing market. The article considers the problems of information support for the search for sources of innovation and partnership to decrease the time for implementation of novelty products.
Keywords: partnership, novelty, market feedback, alliance
Procedia PDF Downloads 194
3372 Springback Prediction for Sheet Metal Cold Stamping Using Convolutional Neural Networks
Abstract:
Cold stamping has been widely applied in the automotive industry for the mass production of a great range of automotive panels. Predicting the springback to ensure the dimensional accuracy of the cold-stamped components is a critical step. The main approaches for the prediction and compensation of springback in cold stamping include running Finite Element (FE) simulations and conducting experiments, which require forming process expertise and can be time-consuming and expensive for the design of cold stamping tools. Machine learning technologies have been proven and successfully applied in learning complex system behaviours using representative samples. These technologies exhibit promising potential to be used as supporting design tools for metal forming technologies. This study, for the first time, presents a novel application of a Convolutional Neural Network (CNN) based surrogate model to predict the springback fields for variable U-shape cold bending geometries. A dataset is created based on the U-shape cold bending geometries and the corresponding FE simulation results. The dataset is then applied to train the CNN surrogate model. The result shows that the surrogate model can achieve near-indistinguishable full-field predictions in real time when compared with the FE simulation results. The application of CNNs in efficient springback prediction can be adopted in industrial settings to aid both conceptual and final component designs for designers without manufacturing knowledge.
Keywords: springback, cold stamping, convolutional neural networks, machine learning
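A minimal sketch of what such a CNN surrogate could look like, assuming the geometry is encoded as a single-channel image and the springback field as an image of the same size; the architecture and tensor shapes are illustrative assumptions, not the paper's network.

```python
# Minimal sketch of a CNN surrogate mapping an encoded U-shape geometry field to
# a full-field springback prediction. Shapes and layers are illustrative.
import torch
import torch.nn as nn

class SpringbackCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),   # springback field out
        )

    def forward(self, x):                  # x: (batch, 1, H, W) geometry encoding
        return self.net(x)

model = SpringbackCNN()
geometry = torch.rand(8, 1, 64, 64)        # hypothetical encoded U-shape geometries
target = torch.rand(8, 1, 64, 64)          # stand-in for FE-simulated springback fields
loss = nn.MSELoss()(model(geometry), target)
loss.backward()
print(float(loss))
```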
Procedia PDF Downloads 149
3371 An Application of Integrated Multi-Objective Particles Swarm Optimization and Genetic Algorithm Metaheuristic through Fuzzy Logic for Optimization of Vehicle Routing Problems in Sugar Industry
Authors: Mukhtiar Singh, Sumeet Nagar
Abstract:
Vehicle routing problem (VRP) is a combinatorial optimization and nonlinear programming problem aiming to optimize decisions regarding a given set of routes for a fleet of vehicles in order to provide cost-effective and efficient delivery of both services and goods to the intended customers. This paper proposes the application of integrated particle swarm optimization (PSO) and genetic algorithm (GA) to address the vehicle routing problem in the sugarcane industry in India. The sugar industry is a very prominent agro-based industry in India due to its impact on rural livelihoods, and it is estimated to employ around 5 lakh (500,000) workers directly in sugar mills. Due to various inadequacies, inefficiencies, and inappropriateness associated with the current vehicle routing model, the industry suffers huge monetary losses, which need to be addressed in the proper context. The proposed algorithm utilizes the crossover operation that originally appears in the genetic algorithm (GA) to improve its flexibility and manipulation and to avoid being trapped in a local optimum; simultaneously, to improve the convergence speed of the algorithm, level set theory is also added to it. We apply the hybrid approach to an example VRP and compare its results with those generated by the PSO, GA, and parallel PSO algorithms. The experimental comparison results indicate that the performance of the hybrid algorithm is superior to the others, and it will become an effective approach for solving discrete combinatorial problems.
Keywords: fuzzy logic, genetic algorithm, particle swarm optimization, vehicle routing problem
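As an illustration of the GA-style crossover folded into the PSO update, here is a sketch of an order crossover (OX) on two customer-visit permutations; this is a generic operator, not the authors' exact hybrid formulation.

```python
# Sketch of an order crossover (OX) on two route permutations: keep a slice of
# parent A and fill the remaining positions in parent B's customer order.
import random

def order_crossover(parent_a, parent_b):
    """Produce a child route combining a segment of parent_a with parent_b's order."""
    n = len(parent_a)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = parent_a[i:j + 1]                   # keep a segment of A
    fill = [c for c in parent_b if c not in child]       # remaining customers, B's order
    pos = 0
    for k in range(n):
        if child[k] is None:
            child[k] = fill[pos]
            pos += 1
    return child

route_a = [3, 1, 4, 7, 2, 6, 5]
route_b = [7, 2, 5, 1, 6, 3, 4]
print(order_crossover(route_a, route_b))
```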
Procedia PDF Downloads 394
3370 The Threat Posed by Dominant Languages to Minor Languages or Dialects: The Case of isiZulu and isiBhaca in Umzimkhulu, KwaZulu-Natal
Authors: Yanga Lusanda Praiseworth Majola
Abstract:
The small town of Umzimkhulu is situated in the KwaZulu-Natal province of South Africa and was once part of the Bantustan of Transkei. Citizens of Umzimkhulu are called amaBhaca because they speak isiBhaca, which is a non-standard language but is mutually intelligible with three standard official languages, isiXhosa, isiZulu, and siSwati. Since Umzimkhulu was under the Eastern Cape Province prior to 2006, isiXhosa is used for official purposes, particularly in schools, while isiZulu is used in other sectors; this is despite the fact that the majority of Umzimkhulu citizens regard themselves as amaBhaca. This poses a threat to both isiBhaca as a language and the identity of amaBhaca because Umzimkhulu is situated in KZN, where isiZulu is the dominant language spoken by the majority in the province. The primary objective of this study is to unveil, using language dominance theory, how dominant languages pose a threat to minority and developing languages or dialects. The study employed a mixed-methods approach. Data were obtained from key community members and leaders who were identified as amaBhaca and who have lived in Umzimkhulu their whole lives. The main findings of the study are that although isiBhaca is classified as a dialect of isiXhosa, linguistically, it is closer to isiZulu, and thus isiZulu poses a significant threat to the existence of isiBhaca, since it becomes easy for amaBhaca to switch from isiBhaca to isiZulu and end up losing interest in isiBhaca. Respondents revealed that, in their view, isiBhaca is a language of its own, and the continuous use and empowerment of isiZulu in Umzimkhulu, particularly in professional settings, is detrimental to isiBhaca; this subsequently has the potential of endangering the existence of isiBhaca and might lead to its attrition.
Keywords: language dominance, dominant languages, minority languages, language attrition
Procedia PDF Downloads 87
3369 A Systematic Mapping of the Use of Information and Communication Technology (ICT)-Based Remote Agricultural Extension for Women Smallholders
Authors: Busiswa Madikazi
Abstract:
This systematic mapping study explores the underrepresentation of women's contributions to farming in the Global South within the development of Information and Communication Technologies (ICT)-based extension methods. Despite women farmers constituting 70% of the agricultural labour force, their productivity is hindered by various constraints, including illiteracy, household commitments, and limited access to credit and markets. A systematic mapping approach was employed with the aim of identifying evidence gaps in existing ICT extension for women farmers. The data collection protocol follows a structured approach, incorporating key criteria for inclusion, exclusion, search strategy, and coding, as well as the PICO strategy (Population, Intervention, Comparator, and Outcome). The results yielded 119 articles that qualified for inclusion. The findings highlight that mobile phone apps (WhatsApp) and radio/television programming are the primary extension methods employed, while integrating ICT with training, field visits, and demonstrations is underutilized. Notably, the study emphasizes the inadequate attention to critical issues such as food security, gender equality, and attracting youth to farming within ICT extension efforts. These findings indicate a significant policy and practice gap, neglecting community-driven approaches that cater to women's specific needs and enhance their agricultural production. The map highlights the importance of refocusing ICT extension efforts to address women farmers' unique challenges, thereby contributing to their empowerment and improving agricultural practices.
Keywords: agricultural extension, ICT, women farmers, smallholders
Procedia PDF Downloads 63
3368 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study researches the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This improved dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning not only increases diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic estimates.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
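A minimal sketch of the report-text branch, assuming TF-IDF features feeding a Random Forest classifier; the file, columns, and label are illustrative, and the study additionally merges image-derived features and LLM outputs, which are omitted here.

```python
# Sketch: TF-IDF features from radiology report text feeding a Random Forest.
# Column names and the binary label are hypothetical placeholders.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

reports = pd.read_csv("indiana_reports.csv")        # hypothetical export of the dataset
X_train, X_test, y_train, y_test = train_test_split(
    reports["findings_text"], reports["abnormal"], test_size=0.2, random_state=42)

clf = make_pipeline(TfidfVectorizer(max_features=5000, ngram_range=(1, 2)),
                    RandomForestClassifier(n_estimators=300, random_state=0))
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```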
Procedia PDF Downloads 46
3367 Advantages of Multispectral Imaging for Accurate Gas Temperature Profile Retrieval from Fire Combustion Reactions
Authors: Jean-Philippe Gagnon, Benjamin Saute, Stéphane Boubanga-Tombet
Abstract:
Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. However, it is well known that most combustion gases such as carbon dioxide (CO₂), water vapor (H₂O), and carbon monoxide (CO) selectively absorb/emit infrared radiation at discrete energies, i.e., over a very narrow spectral range. Therefore, temperature profiles of most combustion processes derived from conventional broadband imaging are inaccurate without prior knowledge or assumptions about the spectral emissivity properties of the combustion gases. Using spectral filters allows estimating these critical emissivity parameters in addition to providing selectivity regarding the chemical nature of the combustion gases. However, due to the turbulent nature of most flames, it is crucial that such information be obtained without sacrificing temporal resolution. For this reason, Telops has developed a time-resolved multispectral imaging system which combines a high-performance broadband camera synchronized with a rotating spectral filter wheel. In order to illustrate the benefits of using this system to characterize combustion experiments, measurements were carried out using a Telops MS-IR MW on a very simple combustion system: a wood fire. The temperature profiles calculated using the spectral information from the different channels were compared with corresponding temperature profiles obtained with conventional broadband imaging. The results illustrate the benefits of the Telops MS-IR cameras for the characterization of laminar and turbulent combustion systems at a high temporal resolution.
Keywords: infrared, multispectral, fire, broadband, gas temperature, IR camera
Procedia PDF Downloads 143
3366 Intonation Salience as an Underframe to Text Intonation Models
Authors: Tatiana Stanchuliak
Abstract:
It is common knowledge that intonation is not laid over a ready text. On the contrary, intonation forms and accompanies the text at the level of its birth in the speaker's mind. As a result, intonation plays one of the fundamental roles in the process of transferring a thought into external speech. Intonation structure can highlight the semantic significance of textual elements and become a ranging mark in understanding the information structure of the text. Intonation functions by means of prosodic characteristics, one of which is intonation salience, whose function in texts results in making some textual elements more prominent than others. This function of intonation, therefore, performs as an organizing one. It helps to form the frame of key elements of the text. The study under consideration made an attempt to look into the inner nature of salience and create a sort of text intonation model. This general goal led to some more specific intermediate results. First, degrees of salience were established at the level of the smallest semantic element, the intonation group, and the prosodic means of creating salience were examined. Second, the most frequent combinations of prosodic means made it possible to distinguish patterns of salience, which then became constituent elements of a text intonation model. Third, the analysis of the predicate structure made it possible to divide the whole text into smaller parts, or units, each performing a specific function in developing the general communicative intention. It appeared that such units can be found in any text and that they have common characteristics in their intonation arrangement. These findings are certainly very important both for the theory of intonation and for its practical application.
Keywords: accentuation, inner speech, intention, intonation, intonation functions, models, patterns, predicate, salience, semantics, sentence stress, text
Procedia PDF Downloads 266
3365 Enhancing Quality Management Systems through Automated Controls and Neural Networks
Authors: Shara Toibayeva, Irbulat Utepbergenov, Lyazzat Issabekova, Aidana Bodesova
Abstract:
The article discusses the importance of quality assessment as a strategic tool in business and emphasizes the significance of the effectiveness of quality management systems (QMS) for enterprises. The evaluation of these systems takes into account the specificity of quality indicators, the multilevel nature of the system, and the need for optimal selection of the number of indicators and evaluation of the system state, which is critical for making rational management decisions. Methods and models of automated enterprise quality management are proposed, including an intelligent automated quality management system integrated with the Management Information and Control System. These systems make it possible to automate the implementation and support of QMS, increasing the validity, efficiency, and effectiveness of management decisions by automating the functions performed by decision makers and personnel. The paper also emphasizes the use of recurrent neural networks to improve automated quality management. Recurrent neural networks (RNNs) are used to analyze and process sequences of data, which is particularly useful in the context of document quality assessment and non-conformance detection in quality management systems. These networks are able to account for temporal dependencies and complex relationships between different data elements, which improves the accuracy and efficiency of automated decisions. The project was supported by a grant from the Ministry of Education and Science of the Republic of Kazakhstan under the Zhas Galym project No. AR 13268939, dedicated to research and development of digital technologies to ensure consistency of QMS regulatory documents.
Keywords: automated control system, quality management, document structure, formal language
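A hedged sketch of the recurrent idea: a small GRU reads a QMS document as a token sequence and scores its conformance; the vocabulary, dimensions, and binary label are illustrative assumptions rather than the system's actual design.

```python
# Sketch: a GRU sequence model scoring a QMS document for conformance.
# Vocabulary size, dimensions, and the "conforming" label are illustrative.
import torch
import torch.nn as nn

class DocumentScorer(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, token_ids):                   # (batch, seq_len) integer tokens
        _, h = self.rnn(self.embed(token_ids))
        return torch.sigmoid(self.head(h[-1]))      # probability of conformance

model = DocumentScorer()
batch = torch.randint(0, 5000, (4, 200))             # four hypothetical tokenized documents
print(model(batch).squeeze(-1))
```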
Procedia PDF Downloads 39
3364 Global Low Carbon Transitions in the Power Sector: A Machine Learning Archetypical Clustering Approach
Authors: Abdullah Alotaiq, David Wallom, Malcolm McCulloch
Abstract:
This study presents an archetype-based approach to designing effective strategies for low-carbon transitions in the power sector. To achieve global energy transition goals, a renewable energy transition is critical, and understanding diverse energy landscapes across different countries is essential to design effective renewable energy policies and strategies. Using a clustering approach, this study identifies 12 energy archetypes based on the electricity mix, socio-economic indicators, and renewable energy contribution potential of 187 UN countries. Each archetype is characterized by distinct challenges and opportunities, ranging from high dependence on fossil fuels to low electricity access, low economic growth, and insufficient contribution potential of renewables. Archetype A, for instance, consists of countries with low electricity access, high poverty rates, and limited power infrastructure, while Archetype J comprises developed countries with high electricity demand and installed renewables. The study findings have significant implications for renewable energy policymaking and investment decisions, with policymakers and investors able to use the archetype approach to identify suitable renewable energy policies and measures and assess renewable energy potential and risks. Overall, the archetype approach provides a comprehensive framework for understanding diverse energy landscapes and accelerating decarbonisation of the power sector.
Keywords: fossil fuels, power plants, energy transition, renewable energy, archetypes
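A minimal sketch of the archetype clustering, assuming k-means on standardized country-level indicators with k = 12 as described above; the indicator columns and data file are illustrative placeholders.

```python
# Sketch: k-means archetype clustering of countries on standardized indicators.
# The CSV file and column names are hypothetical stand-ins for the study's data.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline

countries = pd.read_csv("country_indicators.csv")     # 187 UN countries, hypothetical file
features = ["fossil_share", "renewable_share", "electricity_access",
            "gdp_per_capita", "electricity_demand", "renewable_potential"]

pipeline = make_pipeline(StandardScaler(),
                         KMeans(n_clusters=12, n_init=10, random_state=0))
countries["archetype"] = pipeline.fit_predict(countries[features])
print(countries.groupby("archetype")[features].mean())   # archetype profiles
```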
Procedia PDF Downloads 51
3363 Study of the Persian Gulf's and Oman Sea's Numerical Tidal Currents
Authors: Fatemeh Sadat Sharifi
Abstract:
In this research, a barotropic model was employed to study the tides in the Persian Gulf and Oman Sea, where the tidal force was the only forcing applied. To do that, a finite-difference, free-surface model called the Regional Ocean Modeling System (ROMS) was employed on data over the Persian Gulf and Oman Sea. To analyze flow patterns of the region, the results of a limited-area model, the Finite Volume Community Ocean Model (FVCOM), were used. These two water bodies were chosen since both are among the most critical in terms of economy, biology, fishery, shipping, navigation, and petroleum extraction. The OSU Tidal Prediction Software (OTPS) tide and observation data were used to validate the modeled results. Next, tidal elevation and speed, together with a tidal analysis, were interpreted. Preliminary results show significant accuracy in the tidal height compared with observation and OTPS data, and indicate that tidal currents are highest in the Hormuz Strait and in the narrow, shallow region between the Iranian coast and islands. Furthermore, the tidal analysis clarifies that the M2 component has the most significant value. Finally, the Persian Gulf tidal currents are divided into two branches: the first branch runs from the south toward Qatar and, via the United Arab Emirates, rotates toward the Hormuz Strait. The second branch, in the north and west, extends up to the highest point of the Persian Gulf and turns counterclockwise at the head of the Gulf.
Keywords: numerical model, barotropic tide, tidal currents, OSU tidal prediction software, OTPS
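A small sketch of the harmonic-analysis step, fitting the M2 constituent (period ≈ 12.42 h) to an elevation time series by least squares; the synthetic series stands in for model or observation output, and the study's actual tidal-analysis toolchain may differ.

```python
# Sketch: least-squares fit of the M2 constituent to a sea-surface elevation series.
import numpy as np

t_hours = np.arange(0, 30 * 24, 1.0)                    # one month, hourly samples
omega_m2 = 2.0 * np.pi / 12.4206                        # M2 angular frequency (rad/h)
eta = 0.6 * np.cos(omega_m2 * t_hours - 0.8) + 0.05 * np.random.randn(t_hours.size)

# eta ~ a*cos(wt) + b*sin(wt): solve the linear least-squares problem for a, b
A = np.column_stack([np.cos(omega_m2 * t_hours), np.sin(omega_m2 * t_hours)])
(a, b), *_ = np.linalg.lstsq(A, eta, rcond=None)

amplitude = np.hypot(a, b)
phase = np.arctan2(b, a)
print(f"M2 amplitude ~ {amplitude:.2f} m, phase ~ {np.degrees(phase):.1f} deg")
```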
Procedia PDF Downloads 131
3362 How Reverse Logistics Can Improve the Sustainability Performance of a Business?
Authors: Taknaz Banihashemi, Jiangang Fei, Peggy Shu-Ling Chen
Abstract:
Reverse logistics (RL) is a part of the logistics of companies, and its aim is to reclaim value from returned products in an environmentally friendly manner. In recent years, RL has attracted significant attention among both practitioners and academics due to environmental directives and governmental legislation, consumer concerns and social responsibility for the environment, awareness of the limits of natural resources, and economic potential. Sustainable development is considered a critical goal for organisations due to its impact on competitive advantage. With growing environmental concerns and legal regulations related to green and sustainability issues, product disposition through RL can be considered an environmentally, economically, and socially sound way to achieve sustainable development. When employed properly, RL can help firms to improve their sustainability performance. The aim of this paper is to investigate sustainability issues in the context of RL from the perspective of the triple-bottom-line approach. Content analysis was used to collect the information. The findings show that there is a research gap in investigating the relationship between RL and sustainability performance. Most of the studies have focused on performance evaluation of RL by considering the factors related to economic and environmental performance. RL can have significant effects on social issues along with economic and environmental issues. The inclusion of the social aspect in the sustainability performance will provide a complete and holistic picture of how RL may impact the sustainability performance of firms. Generally, there is a lack of research investigating the relationship between RL and sustainability by integrating the three pillars of triple-bottom-line sustainability performance. This paper provides academics and researchers a broad view of the correlations between RL and sustainability performance.
Keywords: reverse logistics, review, sustainability, sustainability performance
Procedia PDF Downloads 154
3361 Ipsilateral Heterotopic Ossification in the Knee and Shoulder Post Long COVID-19
Authors: Raheel Shakoor Siddiqui, Calvin Mathias, Manikandar Srinivas Cheruvu, Bobin Varghese
Abstract:
A 58-year-old gentleman presented to accident and emergency at the district general hospital with worsening shortness of breath and a non-productive cough over a period of five days. He was initially admitted under the medical team on suspicion of SARS-CoV-2 (COVID-19) pneumonitis. Subsequently, upon deterioration of his observations and a positive COVID-19 PCR, he was taken to intensive care for invasive mechanical ventilation. He required frequent proning and inotropic support and was intubated for thirty-three days. After successful extubation, he developed myopathy with a limited range of motion of his right knee and right shoulder. Plain film imaging of these limbs demonstrated an unusual formation of heterotopic ossification without any precipitating trauma or surgery. Current literature includes only limited case series portraying heterotopic ossification post-COVID-19. There has been negligible evidence of heterotopic ossification in the ipsilateral knee and shoulder following prolonged immobility secondary to a critical illness. Physiotherapy and rehabilitation after intensive care can be prolonged due to the formation of heterotopic ossification around joints. Prolonged hospital stays may lead to a higher risk of developing chest and urinary infections and pressure sores. This raises the question of whether a severe systemic inflammatory immune response to the SARS-CoV-2 virus results in histopathological processes leading to the formation of heterotopic ossification not previously seen, requiring prolonged physiotherapy.
Keywords: orthopaedics, rehabilitation, physiotherapy, heterotopic ossification, COVID-19
Procedia PDF Downloads 71
3360 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion
Authors: Ali Kazemi
Abstract:
In today's volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning techniques have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study presents a groundbreaking method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is meticulously designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by sizable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that correctly reflects market intricacies. We meticulously collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into the market's buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate critical macroeconomic indicators such as interest rates, inflation rates, GDP growth, and unemployment rates into our model. Our GCN algorithm is adept at learning the relational patterns among financial instruments represented as nodes in a comprehensive market graph. Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling our model to grasp the complex network of influences governing market movements. Complementing this, our LSTM algorithm is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume data. This lets the LSTM capture and predict temporal market trends accurately. In the comprehensive evaluation of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price movements. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework.
Our findings promise to revolutionize investment techniques and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting
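A compact sketch of the GCN-LSTM idea, assuming one graph-convolution layer mixing information across assets at each time step and an LSTM reading the resulting sequence; the shapes, adjacency matrix, and data below are illustrative placeholders, not the authors' implementation.

```python
# Sketch: a single graph convolution per time step followed by an LSTM that
# predicts next-day direction logits per asset. All inputs are synthetic.
import torch
import torch.nn as nn

class GCNLSTM(nn.Module):
    def __init__(self, n_assets, n_feat, gcn_dim=32, lstm_dim=64):
        super().__init__()
        self.gcn_weight = nn.Linear(n_feat, gcn_dim)
        self.lstm = nn.LSTM(n_assets * gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, n_assets)          # one logit per asset

    def forward(self, x, adj_norm):
        # x: (batch, time, assets, features); adj_norm: normalized adjacency (assets, assets)
        h = torch.relu(adj_norm @ self.gcn_weight(x))       # graph convolution per step
        h = h.flatten(start_dim=2)                          # (batch, time, assets*gcn_dim)
        out, _ = self.lstm(h)
        return self.head(out[:, -1])                        # direction logits at last step

n_assets, n_feat = 6, 5                                      # e.g., indices + cryptocurrencies
adj = torch.eye(n_assets) + 0.1                              # toy co-movement graph
adj_norm = adj / adj.sum(dim=1, keepdim=True)
model = GCNLSTM(n_assets, n_feat)
window = torch.rand(8, 30, n_assets, n_feat)                 # 8 samples, 30-day windows
print(model(window, adj_norm).shape)                         # -> torch.Size([8, 6])
```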
Procedia PDF Downloads 66
3359 Politicization of Humanitarian NGOs: A Comparison Study of Oxfam and Médecins Sans Frontières (MSF)
Authors: Ratih Andaruni Widhiantari
Abstract:
The combination of the expanding population of aid agencies and the politicization of humanitarian intervention has blurred the distinction between what humanitarians accept as universal human rights in theory and their practices in humanitarian intervention. Humanitarian organizations are now venturing into the formerly taboo territory of politics that places individuals at risk, for example, cooperating and coordinating with intervening states, considering moments of destruction as opportunities for political change, and even taking on functions that had once been the exclusive preserve of government. Hence, aid agencies are becoming involved in matters of local or even international politics. This study focuses on the comparison between Oxfam and Médecins Sans Frontières (MSF), or Doctors Without Borders, and their different attitudes toward political influences in humanitarian aid. It aims to untangle whether these contradictory approaches to politics become a barrier to upholding their principles as humanitarian actors, as well as the consequences of taking one particular position. An analysis of quantitative data and a qualitative literature analysis are presented. The findings indicate that Oxfam is actively engaged with politics. It welcomes cooperation with government and the private sector to reach its goal of alleviating global inequalities. On the other hand, MSF has always taken a strong position in refusing any political influence within its aid programmes. With no financial assistance from any government, MSF is free from any direct political intervention. Hence, it can work efficiently with a clear objective to respond to demand-side pressures from the people in need. It is still publicly against political involvement in humanitarian activity, but in practice, it has been moving toward politicization by its own definition.
Keywords: humanitarian agencies, humanitarian intervention, humanitarian principles, politicization of humanitarianism
Procedia PDF Downloads 277
3358 Opportunities for Precision Feed in Apiculture
Authors: John Michael Russo
Abstract:
Honeybees are important to our food system and continue to suffer from high rates of colony loss. Precision feed has brought many benefits to livestock cultivation, and these should transfer to apiculture. However, apiculture has unique challenges. The objective of this research is to understand how principles of precision agriculture, applied to apiculture and feed specifically, might effectively improve state-of-the-art cultivation. The methodology surveys apicultural practice to build a model for assessment. First, a review of apicultural motivators is made. Feed method is then evaluated. Finally, precision feed methods are examined as accelerants with the potential to advance the effectiveness of feed practice. Six important motivators emerge: colony loss, disease, climate change, site variance, operational costs, and competition. Feed practice itself is used to compensate for environmental variables. The research finds that the current state of the art in apiculture feed focuses on critical challenges in the management of feed schedules which satisfy the requirements of the bees, preserve potency, optimize environmental variables, and manage costs. Many of the challenges are most acute when feed is used to dispense medication. Technologies such as RNA treatments have even more rigorous demands. Precision feed solutions focus on strategies which accommodate the specific needs of individual livestock. A major component is data: these solutions integrate precise data with methods that respond to individual needs. There is enormous opportunity for precision feed to improve apiculture through the integration of precision data with policies to translate data into optimized action in the apiary, particularly through automation.
Keywords: precision agriculture, precision feed, apiculture, honeybees
Procedia PDF Downloads 78