Search results for: electronic intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3275

305 A Scoping Review of the Relationship Between Oral Health and Wellbeing: The Myth and Reality

Authors: Heba Salama, Barry Gibson, Jennifer Burr

Abstract:

Introduction: It is often argued that better oral health leads to better wellbeing, and that the goal of dental care is to improve wellbeing. Notwithstanding, to the best of our knowledge, there is a lack of evidence to support the relationship between oral health and wellbeing. Aim: The scoping review aims to examine current definitions of health and wellbeing and to map the evidence on the relationship between oral health and wellbeing. Methods: The scoping review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR). A two-phase search strategy was followed because of the unmanageable number of hits returned. The first phase was to identify how wellbeing was conceptualised in the oral health literature, and the second phase was to search for the extracted keywords. The extracted keywords were searched in four databases: PubMed, CINAHL, PsycINFO, and Web of Science. To limit the number of studies to a manageable amount, the search was restricted to open-access studies published in the last five years (2018–2022). Results: Only eight studies (0.1%) of the 5455 results met the review inclusion criteria. Most of the included studies defined wellbeing in terms of hedonic theory, and the Satisfaction with Life Scale was the most commonly used measure. Although the results are inconsistent, the included studies generally reported a weak or no association between oral health and wellbeing. Interpretation: The review revealed an important point: the oral health literature uses loose definitions that have significant implications for empirical research and that can result in misleading evidence-based conclusions. According to the review results, improving oral health is not a key factor in improving wellbeing, and investing in oral health care specifically to improve wellbeing cannot be presented to policymakers as a top priority. This does not imply that there should be no investment in oral health care to improve oral health, which could be linked to wellbeing indirectly by eliminating the potential oral health-related barriers to quality of life that may represent the foundation of wellbeing. Limitation: Only the most recent five years (2018–2022), peer-reviewed English-language literature, and four electronic databases were included in the search. These restrictions were put in place to keep the volume of literature at a manageable level, which means that some significant studies might have been omitted. Furthermore, the study used a definition of wellbeing that is still evolving and that not everyone may agree with. Conclusion: Whilst it is a ubiquitous argument that oral health is related to wellbeing, and this seems logical, there is little empirical evidence to support the claim. This question, therefore, requires much more detailed consideration. Funding: This project was funded by the Ministry of Higher Education and Scientific Research in Libya and Tripoli University.

Keywords: oral health, wellbeing, satisfaction, emotion, quality of life, oral health related quality of life

Procedia PDF Downloads 118
304 Real Estate Trend Prediction with Artificial Intelligence Techniques

Authors: Sophia Liang Zhou

Abstract:

For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and have largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five metropolitan areas representing different market trends and compared three time-lag settings: no lag, a 6-month lag, and a 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model real estate prices using datasets containing the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, for five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data, from March 2005 to December 2018, were collected from the Federal Reserve Bank, FBI, and Freddie Mac. In the original data, some factors are reported monthly, some quarterly, and some yearly, so two methods of filling in the missing values, backfilling and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF’s inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy (>95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. The results also showed that the technique used to fill missing values in the dataset and the implementation of a time lag can have a significant influence on model performance and require further investigation. The best performing models varied for each area, but the backfilled 12-month-lag LR models and the interpolated no-lag ANN models showed the most stable performance overall, with accuracies above 95% for each city. This study reveals the influence of input variables in different markets and provides evidence to support future studies in identifying the optimal time lag and data imputation methods for establishing accurate predictive models.
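
As an illustrative aside, and not the authors' published code, the sketch below shows how such a comparison of models, time lags, and imputation methods could be set up with scikit-learn; the file name, column names, hyperparameters, and train/test split are assumptions for demonstration only.

```python
# Hypothetical sketch: compare LR/RF/ANN across time lags and imputation methods.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

def prepare(df, lag_months, method):
    """Resample mixed-frequency features to monthly, fill gaps, and lag the predictors."""
    monthly = df.resample("MS").mean()
    filled = monthly.bfill() if method == "backfill" else monthly.interpolate()
    X = filled.drop(columns=["case_shiller_index"]).shift(lag_months)  # predictors from t - lag
    y = filled["case_shiller_index"]
    data = pd.concat([X, y], axis=1).dropna()
    return data.drop(columns=["case_shiller_index"]), data["case_shiller_index"]

models = {
    "LR": LinearRegression(),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
}

df = pd.read_csv("boston_macro.csv", index_col="date", parse_dates=True)  # hypothetical file
for lag in (0, 6, 12):
    for method in ("backfill", "interpolate"):
        X, y = prepare(df, lag, method)
        split = int(len(X) * 0.8)  # chronological split, since the target is a time series
        for name, model in models.items():
            model.fit(X.iloc[:split], y.iloc[:split])
            pred = model.predict(X.iloc[split:])
            mae = mean_absolute_error(y.iloc[split:], pred)
            rmse = mean_squared_error(y.iloc[split:], pred) ** 0.5
            print(f"lag={lag:2d} {method:11s} {name}: MAE={mae:.2f} RMSE={rmse:.2f}")
```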

Keywords: linear regression, random forest, artificial neural network, real estate price prediction

Procedia PDF Downloads 103
303 Experimental Analysis of the Performance of a System for Freezing Fish Products Equipped with a Modulating Vapour Injection Scroll Compressor

Authors: Domenico Panno, Antonino D’amico, Hamed Jafargholi

Abstract:

This paper presents an experimental analysis of the performance of a system for freezing fish products equipped with a modulating vapour injection scroll compressor operating with R448A refrigerant. Freezing is a critical process for the preservation of seafood products, as it influences quality, food safety, and environmental sustainability. The use of a modulating scroll compressor with vapour injection, combined with the R448A refrigerant, is proposed as a solution to optimize the performance of the system, reducing energy consumption and mitigating the environmental impact. The modulating vapour injection scroll compressor is an advanced technology that allows the compressor capacity to be adjusted to the actual cooling needs of the system. Vapour injection allows the refrigeration cycle to be optimized, reducing the evaporation temperature and improving the overall efficiency of the system. The use of R448A refrigerant, with its low Global Warming Potential (GWP), is part of an environmental sustainability perspective, helping to reduce the climate impact of the system. The aim of this research was to evaluate the performance of the system through a series of experiments conducted on a pilot plant for the freezing of fish products. Several operational variables were monitored and recorded, including evaporation temperature, condensation temperature, energy consumption, and the freezing time of the seafood products. The results of the experimental analysis highlighted the benefits of using the modulating vapour injection scroll compressor with the R448A refrigerant. In particular, a significant reduction in energy consumption was recorded compared to conventional systems. The modulating capacity of the compressor made it possible to adapt cooling production to variations in the thermal load, ensuring optimal operation of the system and reducing energy waste. Furthermore, the use of an electronic expansion valve provided greater precision in the control of the evaporation temperature, with minimal deviation from the desired set point. This helped ensure better quality of the final product, reducing the risk of damage due to temperature changes and ensuring uniform freezing of the fish products. The freezing time of the seafood was significantly reduced thanks to the configuration of the entire system, allowing faster production and greater production capacity of the plant. In conclusion, the use of a modulating vapour injection scroll compressor operating with R448A has proven effective in improving the performance of a system for freezing fish products. This technology offers an optimal balance between energy efficiency, temperature control, and environmental sustainability, making it an advantageous choice for the food industry.
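
Purely as an orienting illustration, not material from the paper, the short snippet below shows how an Energy Efficiency Ratio (EER), the metric listed in the keywords, is typically derived from logged measurements; every number in it is a placeholder.

```python
# Hypothetical EER calculation from measured quantities (placeholder values only).
def cooling_capacity_kw(m_dot_kg_s, h_evap_out_kj_kg, h_evap_in_kj_kg):
    """Refrigerating capacity from refrigerant mass flow and enthalpy rise across the evaporator."""
    return m_dot_kg_s * (h_evap_out_kj_kg - h_evap_in_kj_kg)

def eer(q_cooling_kw, electrical_power_kw):
    """Energy Efficiency Ratio: useful cooling delivered per unit of electrical input."""
    return q_cooling_kw / electrical_power_kw

q = cooling_capacity_kw(m_dot_kg_s=0.12, h_evap_out_kj_kg=395.0, h_evap_in_kj_kg=250.0)  # assumed values
print(f"Q_cooling = {q:.1f} kW, EER = {eer(q, electrical_power_kw=7.5):.2f}")
```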

Keywords: scroll compressor, vapor injection, refrigeration system, EER

Procedia PDF Downloads 45
302 The Effects of Computer Game-Based Pedagogy on Graduate Students' Statistics Performance

Authors: Clement Yeboah, Eva Laryea

Abstract:

A pretest-posttest within-subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the Southeastern United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The t-test for statistical knowledge was statistically significant, indicating significant mean gains in statistical knowledge as a function of the game-based intervention. Likewise, the t-test for statistics-related anxiety was also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop an interest in the subject matter and spend quality time learning the course material as they play, without realizing that they are studying a course they had assumed to be difficult. The future directions of the present study are promising as technology continues to advance and become more widely available. Potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way graduate students learn basic statistics and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers and will continue to be a dynamic and rapidly evolving field for years to come.
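
For illustration only, the following is a minimal sketch of the paired-samples analysis described above; the scores are invented placeholders, not the study's data.

```python
# Paired-samples t-test on hypothetical pretest/posttest knowledge scores.
import numpy as np
from scipy import stats

pre_knowledge = np.array([12, 15, 9, 14, 11, 13, 10, 16])    # hypothetical pretest scores
post_knowledge = np.array([16, 18, 13, 17, 15, 17, 12, 19])  # hypothetical posttest scores

t_stat, p_value = stats.ttest_rel(post_knowledge, pre_knowledge)   # paired-samples t-test
gain = (post_knowledge - pre_knowledge).mean()
d = gain / (post_knowledge - pre_knowledge).std(ddof=1)            # Cohen's dz for paired data
print(f"mean gain={gain:.2f}, t={t_stat:.2f}, p={p_value:.4f}, d={d:.2f}")
```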

Keywords: pretest-posttest within subjects, computer game-based learning, statistics achievement, statistics anxiety

Procedia PDF Downloads 77
301 A Meta-Analysis of School-Based Suicide Prevention for Adolescents and Meta-Regressions of Contextual and Intervention Factors

Authors: E. H. Walsh, J. McMahon, M. P. Herring

Abstract:

Post-primary school-based suicide prevention (PSSP) is a valuable avenue to reduce suicidal behaviours in adolescents. The aims of this meta-analysis and meta-regression were 1) to quantify the effect of PSSP interventions on adolescent suicide ideation (SI) and suicide attempts (SA), and 2) to explore how intervention effects may vary based on important contextual and intervention factors. This study provides further support for the benefits of PSSP by demonstrating lower suicide outcomes in over 30,000 adolescents following PSSP and mental health interventions, and tentatively suggests that intervention effectiveness may vary based on intervention factors. The protocol for this study is registered on PROSPERO (ID=CRD42020168883). Population, intervention, comparison, outcomes, and study design (PICOS) criteria defined eligible studies as cluster randomised studies (n=12) containing PSSP and measuring suicide outcomes. The EBSCOhost aggregate electronic database, Web of Science, and the Cochrane Central Register of Controlled Trials were searched. Cochrane risk-of-bias tools for cluster randomised studies indicated that half of the studies were rated as low risk of bias. Egger's regression test adapted for multi-level modelling indicated that publication bias was not an issue (all ps > .05). Crude and corresponding adjusted pooled log odds ratios (OR) were computed using the metafor package in R, yielding 12 SA and 19 SI effects. Multi-level random-effects models accounting for dependencies of effects from the same study revealed that, in crude models, compared to controls, interventions were significantly associated with 13% (OR=0.87, 95% confidence interval (CI) [0.78, 0.96], Q18=15.41, p=0.63) and 34% (OR=0.66, 95% CI [0.47, 0.91], Q10=16.31, p=0.13) lower odds of SI and SA, respectively. Adjusted models showed similar odds reductions of 15% (OR=0.85, 95% CI [0.75, 0.95], Q18=10.04, p=0.93) and 28% (OR=0.72, 95% CI [0.59, 0.87], Q10=10.46, p=0.49) for SI and SA, respectively. Within-cluster heterogeneity ranged from none to low for SA across crude and adjusted models (0-9%). No heterogeneity was identified for SI across crude and adjusted models (0%). Pre-specified univariate moderator analyses were not significant for SA (all ps > 0.05). Variations in average pooled SA odds reductions across categories of various intervention characteristics were observed, which preliminarily suggests that the effectiveness of interventions may vary across intervention factors. These findings have practical implications for researchers, clinicians, educators, and decision-makers. Further investigation of important logical, theoretical, and empirical moderators of PSSP intervention effectiveness is recommended to establish how and when PSSP interventions best reduce adolescent suicidal behaviour.
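
As a simplified, self-contained illustration of the pooling step (the study itself fitted multi-level random-effects models in R with the metafor package), the sketch below pools hypothetical study-level log odds ratios with a single-level DerSimonian-Laird random-effects model.

```python
# DerSimonian-Laird random-effects pooling of invented log odds ratios.
import numpy as np

log_or = np.array([-0.25, -0.10, -0.35, 0.05, -0.20])   # hypothetical study log odds ratios
var = np.array([0.04, 0.06, 0.05, 0.08, 0.03])          # hypothetical sampling variances

w_fixed = 1.0 / var
y_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_or - y_fixed) ** 2)
C = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - (len(log_or) - 1)) / C)             # between-study variance estimate

w_rand = 1.0 / (var + tau2)
pooled = np.sum(w_rand * log_or) / np.sum(w_rand)
se = np.sqrt(1.0 / np.sum(w_rand))
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}], tau^2 = {tau2:.3f}")
```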

Keywords: adolescents, contextual factors, post-primary school-based suicide prevention, suicide ideation, suicide attempts

Procedia PDF Downloads 101
300 Public Procurement Development Stages in Georgia

Authors: Giorgi Gaprindashvili

Abstract:

One of the best examples of the evolution of public procurement among post-Soviet countries is the set of reforms carried out in Georgia, which brought the country close to international procurement standards. In Georgia, public procurement legislation came into force in 1998. The reform has passed through several stages and has taken the form it has today. It should also be noted that countries with economies in transition, including Georgia, implemented all of their public procurement reforms based on the recommendations and support of the World Bank, the United Nations, and other international organizations. The first law on public procurement in Georgia was adopted on December 9, 1998, and aimed to regulate the procurement process of budget organizations and to provide a transparent and competitive environment through which private companies could access state funds legally. The priorities were identified quite clearly in the wording of the law, but for objective and subjective reasons the law could not be implemented to its full effect. The high level of corruption at all levels of governance can be considered the main obstacle, and it naturally had a direct impact on the procurement process, as well as on transparency and the rational use of state funds. These circumstances were the reasons that reforms in this sphere continued in order to improve the procurement process; in particular, the first wave of reforms began in 2001. The Public Procurement Agency carried out the reform with the World Bank, with the main purpose of streamlining the procurement legislation and harmonizing it with international treaties and agreements. Also, with the support of the World Bank, various activities were carried out to raise awareness among participants in the procurement system. Further major changes to the legislation were introduced in May 2005, also directed towards improving and streamlining the procurement process. The third wave of reform began in 2010 and more or less guaranteed the transparency of the procurement process, which later became the basis for the rational spending of state funds. The reform of the procurement system completely changed the procedures. The reform carried out in Georgia resulted in the introduction of a new electronic tendering system, which benefited the transparency of the process and subsequently became the basis for the further development of a competitive environment, a prerequisite for rational state spending. The increased number of supplier organizations participating in the procurement process resulted in reductions of actual costs relative to estimated costs of 20% to 40%, which is quite a large saving for procuring organizations and allows them to use the freed-up funds for their other needs. From an assessment of the reforms in Georgia in the field of public procurement, it can be concluded that proper regulation of the sector and relevant policy can lead to the rational and transparent spending of the budget by the country's state institutions. The business sector also has the opportunity to work in competitive market conditions and to carry out preliminary analysis, which is a prerequisite for future strategy and development.

Keywords: public administration, public procurement, reforms, transparency

Procedia PDF Downloads 366
299 [Keynote Talk]: Production Flow Coordination on Supply Chains: Brazilian Case Studies

Authors: Maico R. Severino, Laura G. Caixeta, Nadine M. Costa, Raísa L. T. Napoleão, Éverton F. V. Valle, Diego D. Calixto, Danielle Oliveira

Abstract:

One of the biggest barriers that companies face nowadays is the coordination of production flow in their Supply Chains (SC). In this study, coordination is understood as a mechanism for incorporating the entire production channel, with everyone involved focused on achieving the same goals. Sometimes this coordination is attempted through the use of logistics practices or production planning and control methods. No papers were found in the literature that presented the combined use of logistics practices and production planning and control methods. The main objective of this paper is to propose solutions for six case studies combining logistics practices and Ordering Systems (OS). The methodology used in this study was a conceptual decision-making model. This model contains six phases: a) analysis of the types and characteristics of relationships in the SC; b) choice of the OS; c) choice of the logistics practices; d) development of alternative proposals for combined use; e) analysis of the consistency of the chosen alternative; f) qualitative and quantitative assessment of the impact on the coordination of the production flow and verification of the applicability of the proposal in the real case. This study was conducted on six Brazilian SCs from different sectors: footwear, food and beverages, garment, sugarcane, mineral, and metal mechanical. The results showed improvement in the coordination of the production flow through the following proposals: a) for the footwear industry, the use of Period Batch Control (PBC), Quick Response (QR), and Enterprise Resource Planning (ERP); b) for the food and beverage sector, firstly the use of Electronic Data Interchange (EDI), ERP, Continuous Replenishment (CR), and Drum-Buffer-Rope Order (DBR) (for situations in which the plants of the two companies are distant), and secondly EDI, ERP, Milk-Run, and a Continuous Review System (for situations in which the plants of the two companies are close); c) for the garment industry, the use of Collaborative Planning, Forecasting, and Replenishment (CPFR) and a Constant Work-In-Process (CONWIP) system; d) for the sugarcane sector, the use of EDI, ERP, and a CONWIP system; e) for the mineral processing industry, the use of Vendor Managed Inventory (VMI), EDI, and a Max-Min control system; f) for the metal mechanical sector, the use of a CONWIP system and Continuous Replenishment (CR). It should be emphasized that the proposals are recommended exclusively for the client-supplier relationships studied and therefore cannot be generalized to other cases. What can be generalized, however, is the methodology used to choose the best practices for each case. Based on the study, it can be concluded that the combined use of OS and logistics practices enables better coordination of production flow in SCs.
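
As a side illustration that is not part of the paper, the toy simulation below shows the release rule that defines a CONWIP ordering system, one of the OS recommended above: new work enters the line only when one of a fixed pool of WIP cards is free. All quantities are arbitrary.

```python
# Toy CONWIP release logic: total work-in-process is capped by a fixed number of cards.
import random

random.seed(1)
CARDS = 5                       # CONWIP cap on jobs simultaneously in the line
backlog = list(range(1, 21))    # 20 hypothetical customer orders waiting for release
in_process = []                 # jobs currently holding a card
completed = []

for day in range(1, 31):
    # finish jobs probabilistically (stand-in for actual processing times)
    finished = [job for job in in_process if random.random() < 0.3]
    for job in finished:
        in_process.remove(job)
        completed.append(job)
    # release new work only while free cards exist
    while backlog and len(in_process) < CARDS:
        in_process.append(backlog.pop(0))
    print(f"day {day:2d}: WIP={len(in_process)} completed={len(completed)}")
```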

Keywords: supply chain management, production flow coordination, logistics practices, ordering systems

Procedia PDF Downloads 208
298 Enhanced Field Emission from Plasma Treated Graphene and 2D Layered Hybrids

Authors: R. Khare, R. V. Gelamo, M. A. More, D. J. Late, Chandra Sekhar Rout

Abstract:

Graphene has emerged as a promising material for various applications, ranging from complementary integrated circuits to optically transparent electrodes for displays and sensors. The excellent conductivity and atomically sharp edges of its unique two-dimensional structure make graphene a propitious field emitter. Graphene analogues of other 2D layered materials have emerged in materials science and nanotechnology due to the enriched physics and novel enhanced properties they present. There are several advantages to using 2D nanomaterials in field emission based devices, including a thickness of only a few atomic layers, a high aspect ratio (the ratio of lateral size to sheet thickness), excellent electrical properties, extraordinary mechanical strength, and ease of synthesis. Furthermore, the presence of edges can enhance the tunneling probability for the electrons in layered nanomaterials, similar to that seen in nanotubes. Here we report the electron emission properties of multilayer graphene and the effect of plasma (CO2, O2, Ar and N2) treatment. The plasma treated multilayer graphene shows enhanced field emission behavior, with a low turn-on field of 0.18 V/μm and a high emission current density of 1.89 mA/cm2 at an applied field of 0.35 V/μm. Further, we report field emission studies of layered WS2/RGO and SnS2/RGO composites. The turn-on field required to draw a field emission current density of 1 μA/cm2 is found to be 3.5, 2.3 and 2 V/μm for WS2, RGO and the WS2/RGO composite, respectively. The enhanced field emission behavior observed for the WS2/RGO nanocomposite is attributed to a high field enhancement factor of 2978, which is associated with the surface protrusions of the single-to-few-layer-thick sheets of the nanocomposite. The highest current density of ~800 μA/cm2 is drawn at an applied field of 4.1 V/μm from a few layers of the WS2/RGO nanocomposite. Furthermore, first-principles density functional calculations suggest that the enhanced field emission may also be due to an overlap of the electronic structures of WS2 and RGO, where graphene-like states are dumped in the region of the WS2 fundamental gap. Similarly, the turn-on field required to draw an emission current density of 1 μA/cm2 is significantly lower (almost half the value) for the SnS2/RGO nanocomposite (2.65 V/μm) compared to pristine SnS2 (4.8 V/μm) nanosheets. The field enhancement factor β (~3200 for SnS2 and ~3700 for the SnS2/RGO composite) was calculated from Fowler-Nordheim (FN) plots and indicates emission from the nanometric geometry of the emitter. The field emission current versus time plot shows good overall emission stability for the SnS2/RGO emitter. The DFT calculations reveal that the enhanced field emission properties of SnS2/RGO composites arise from a substantial lowering of the work function of SnS2 when supported by graphene, in response to p-type doping of the graphene substrate. Graphene and 2D analogue materials thus emerge as potential candidates for future field emission applications.
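
Purely for illustration, with invented values rather than the paper's data, the snippet below shows the standard way a field enhancement factor β is extracted from a Fowler-Nordheim plot: ln(J/E²) is fitted against 1/E and β follows from the slope, given an assumed work function φ.

```python
# Extract a field enhancement factor from a Fowler-Nordheim plot (placeholder data).
import numpy as np

B = 6830.0          # FN constant in eV^(-3/2) V um^-1 when E is expressed in V/um
phi = 5.0           # assumed emitter work function in eV
E = np.array([2.5, 3.0, 3.5, 4.0, 4.5])            # applied field, V/um (hypothetical)
J = np.array([1e-6, 8e-6, 5e-5, 2.4e-4, 8e-4])     # current density, A/cm^2 (hypothetical)

slope, intercept = np.polyfit(1.0 / E, np.log(J / E**2), 1)   # linear fit in FN coordinates
beta = -B * phi**1.5 / slope                                   # field enhancement factor
print(f"FN slope = {slope:.1f}, beta ~ {beta:.0f}")
```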

Keywords: graphene, layered material, field emission, plasma, doping

Procedia PDF Downloads 361
297 Evolution of Web Development Progress in Modern Information Technology

Authors: Abdul Basit Kiani

Abstract:

Web development, the art of creating and maintaining websites, has witnessed remarkable advancements. The aim is to provide an overview of some of the cutting-edge developments in the field. Firstly, the rise of responsive web design has revolutionized user experiences across devices. With the increasing prevalence of smartphones and tablets, web developers have adapted to ensure seamless browsing experiences, regardless of screen size. This progress has greatly enhanced accessibility and usability, catering to the diverse needs of users worldwide. Additionally, the evolution of web frameworks and libraries has significantly streamlined the development process. Tools such as React, Angular, and Vue.js have empowered developers to build dynamic and interactive web applications with ease. These frameworks not only enhance efficiency but also bolster scalability, allowing for the creation of complex and feature-rich web solutions. Furthermore, the emergence of progressive web applications (PWAs) has bridged the gap between native mobile apps and web development. PWAs leverage modern web technologies to deliver app-like experiences, including offline functionality, push notifications, and seamless installation. This innovation has transformed the way users interact with websites, blurring the boundaries between traditional web and mobile applications. Moreover, the integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. With the increasing incidents of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. Hence, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.

Keywords: progressive web applications (PWAs), web security, machine learning (ML), web frameworks, responsive web design

Procedia PDF Downloads 54
296 Multidisciplinary Approach to Mio-Plio-Quaternary Aquifer Study in the Zarzis Region (Southeastern Tunisia)

Authors: Ghada Ben Brahim, Aicha El Rabia, Mohamed Hedi Inoubli

Abstract:

Climate change has exacerbated disparities in the distribution of water resources in Tunisia, resulting in significant degradation in quantity and quality over the past five decades. The Mio-Plio-Quaternary aquifer, the primary water source in the Zarzis region, is subject to climatic, geographical, and geological challenges, as well as human stress. The region is experiencing uneven distribution of resources and growing threats from groundwater salinity and saltwater intrusion. Addressing this challenge is critical for the arid region's socioeconomic development, and effective water resource management is required to combat climate change and reduce water deficits. This study uses a multidisciplinary approach, combining geophysical and hydrogeological data analysis, to determine the groundwater potential of this aquifer. We used advanced techniques such as 3D Euler deconvolution and power spectrum analysis to generate detailed anomaly maps and estimate the depths of density sources, identifying significant Bouguer anomalies trending E-W, NW-SE, and NE-SW. Various techniques, such as wavelength filtering, upward continuation, and horizontal and vertical derivatives, were used to enhance the gravity data, yielding consistent anomaly shapes and amplitudes. The Euler deconvolution method revealed two prominent surface faults, trending NE-SW and NW-SE, that have a significant impact on the distribution of sedimentary facies and water quality within the Mio-Plio-Quaternary aquifer. Additionally, depth maxima greater than 1400 m to the north indicate the presence of a Cretaceous paleo-fault. Geoelectrical models and resistivity pseudo-sections were used to interpret the distribution of electrical facies in the Mio-Plio-Quaternary aquifer, highlighting lateral variation and the type of depositional environment. AI optimises the analysis and interpretation of exploration data, which is important for long-term management and water security. Machine learning algorithms and deep learning models analyse large datasets to provide precise interpretations of subsurface conditions, such as aquifer salinisation. However, AI has limitations, such as the requirement for large datasets, the risk of overfitting, and integration issues with traditional geological methods.
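
As a minimal stand-alone sketch of one of the gravity-enhancement steps mentioned above, and not the authors' actual workflow or data, upward continuation of a gridded anomaly can be implemented with a 2-D FFT by attenuating the spectrum with exp(-|k|h):

```python
# Upward continuation of a regularly gridded potential-field anomaly (synthetic example).
import numpy as np

def upward_continue(grid, dx, dy, height):
    """Continue a gridded anomaly upward by `height` (same units as dx, dy)."""
    ny, nx = grid.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)
    k = np.sqrt(kx[np.newaxis, :] ** 2 + ky[:, np.newaxis] ** 2)   # radial wavenumber
    spectrum = np.fft.fft2(grid) * np.exp(-k * height)              # attenuate short wavelengths
    return np.real(np.fft.ifft2(spectrum))

# synthetic Bouguer-like grid for demonstration only
x, y = np.meshgrid(np.linspace(-5000, 5000, 128), np.linspace(-5000, 5000, 128))
anomaly = 10.0 * np.exp(-((x - 1000) ** 2 + y ** 2) / (2 * 1500.0 ** 2))
smoothed = upward_continue(anomaly, dx=x[0, 1] - x[0, 0], dy=y[1, 0] - y[0, 0], height=500.0)
print(anomaly.max(), smoothed.max())   # the continued field is smoother and lower in amplitude
```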

Keywords: mio-plio-quaternary aquifer, Southeastern Tunisia, geophysical methods, hydrogeological analysis, artificial intelligence

Procedia PDF Downloads 14
295 The Principle of a Thought Formation: The Biological Base for a Thought

Authors: Ludmila Vucolova

Abstract:

The thought is a process that underlies consciousness and cognition, and understanding its origin and processes is a longstanding goal of many academic disciplines. By integrating the over twenty novel ideas and hypotheses of this theoretical proposal, we can speculate that thought is an emergent property of coded neural events, translating the electro-chemical interactions of the body with its environment—the objects of sensory stimulation, X and Y. The latter is a self-generated feedback entity, resulting from the arbitrary pattern of the motion of a body's motor repertory (M). A culmination of these neural events gives rise to a thought: a state of identity between an observed object X and a symbol Y. It manifests as a "state of awareness" or "state of knowing" and forms our perception of the physical world. The values of the variables of a construct—X (object), S1 (sense for the perception of X), Y (object), S2 (sense for the perception of Y), and M (motor repertory that produces Y)—specify the particular conscious percept at any given time. The proposed principle of interaction between the elements of a construct (X, Y, S1, S2, M) is universal and applies to all modes of communication (normal, deaf, blind, deaf and blind people) and to various language systems (Chinese, Italian, English, etc.). The particular arrangement of modalities of each of the three modules S1 (5 of 5), S2 (1 of 3), and M (3 of 3) defines a specific mode of communication. This multifaceted paradigm demonstrates a predetermined pattern of relationships between X, Y, and M that passes from generation to generation. The presented analysis of a cognitive experience encompasses the key elements of embodied cognition theories and unequivocally accords with the scientific interpretation of cognition as the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses; in short, cognition means thinking and awareness. By assembling the novel ideas presented in twelve sections, we can reveal that in the invisible "chaos" there is an order, a structure with landmarks and principles of operation, and that mental processes (thoughts) are physical and have a biological basis. This innovative proposal explains the phenomenon of mental imagery, gives a first insight into the relationship between mental states and brain states, and supports the notion that mind and body are inseparably connected. The findings of this theoretical proposal are supported by current scientific data and are substantiated by the records of the evolution of language and human intelligence.

Keywords: agent, awareness, cognitive, element, experience, feedback, first person, imagery, language, mental, motor, object, sensory, symbol, thought

Procedia PDF Downloads 384
294 Developing a Model to Objectively Assess the Culture of Individuals and Teams in Order to Effectively and Efficiently Achieve Sustainability in the Manpower

Authors: Ahmed Mohamed Elnady Mohamed Elsafty

Abstract:

This paper presents an applied, objective model to measure culture qualitatively and quantitatively, whether in individuals or in teams, so that culture can be used correctly or modified efficiently. The model provides precise measurements and consistent interpretations by being comprehensive, updateable, and protected from being misled by imitations. Methodically, the model divides culture into seven dimensions (43 cultural factors in total): the first dimension is outcome-orientation, which consists of five factors and should be highest in leaders; the second is details-orientation, which consists of eight factors and should be highest in the most intelligent members; the third is team-orientation, which consists of five factors and should be highest in instructors or coaches; the fourth is change-orientation, which consists of five factors and should be highest in soldiers; the fifth is people-orientation, which consists of eight factors and should be highest in media members; the sixth is masculinity, which consists of seven factors and should be highest in hard workers; and the last dimension is stability, which consists of seven factors and should be highest in soft workers. The details of all cultural factors are explained in this paper. In practice, information about each cultural factor must be collected for the targeted person or team in order to calculate the degree of each cultural factor using the suggested equation: 'the score of factor presence' multiplied by 'the score of factor strength'. The details of how to build each score are explained in the paper. Based on the highest degrees, which identify the prominent cultural dimension, placing the tested individual or team in the supposedly right position at the right time provides a chance to use minimal effort to align everyone with the organization's objectives. In other words, making everyone self-motivated by setting him or her at the right source of motivation is the most effective and efficient method of achieving high levels of competency, commitment, and sustainability. A team culture can be modified by excluding or including new members with relatively high or low degrees in specific cultural factors. In conclusion, culture can be considered the software of human beings, and it is one of the major constraining factors on managerial discretion. It represents the behaviors, attitudes, and motivations of the human resources, which are vital to enhancing quality and safety, expanding market share, and defending against attacks from external environments. It is therefore tremendously essential and useful to use such a comprehensive model to measure, use, and modify culture.
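
To make the scoring rule concrete, the toy sketch below applies the degree = presence x strength equation described above and identifies the prominent dimension; the factor names and all scores are invented and do not come from the paper's instrument.

```python
# Toy application of the factor-degree equation: degree = presence score * strength score.
from collections import defaultdict

# (dimension, factor): (presence_score, strength_score) -- hypothetical ratings for one person
ratings = {
    ("outcome-orientation", "drive for results"): (4, 5),
    ("outcome-orientation", "goal clarity"): (3, 4),
    ("details-orientation", "accuracy"): (2, 3),
    ("team-orientation", "cooperation"): (5, 4),
    ("change-orientation", "adaptability"): (3, 2),
}

degrees = {factor: presence * strength for (_, factor), (presence, strength) in ratings.items()}
by_dimension = defaultdict(int)
for (dimension, factor), (presence, strength) in ratings.items():
    by_dimension[dimension] += presence * strength

prominent = max(by_dimension, key=by_dimension.get)   # dimension with the highest total degree
print(degrees)
print(f"prominent dimension: {prominent} (total degree {by_dimension[prominent]})")
```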

Keywords: culture dimensions, culture factors, culture measurement, cultural analysis, cultural modification, self-motivation, alignment to objectives, competency, sustainability

Procedia PDF Downloads 164
293 Raman Spectroscopy Analysis of MnTiO₃-TiO₂ Eutectic

Authors: Adrian Niewiadomski, Barbara Surma, Katarzyna Kolodziejak, Dorota A. Pawlak

Abstract:

Oxide-oxide eutectics are attracting increasing interest from the scientific community because of their unique properties and numerous potential applications. Some of the most interesting examples of applications are metamaterials, glucose sensors, photoactive materials, thermoelectric materials, and photocatalysts. Their unique properties result from the fact that composite materials consist of two or more phases; as a result, these materials have both additive and product properties. Additive properties originate from the particular phases, while product properties originate from the interaction between phases. The MnTiO3-TiO2 eutectic is one such material. TiO2 is a well-known semiconductor, and it is used as a photocatalyst. Moreover, it may be used to produce solar cells, in gas-sensing devices, and in electrochemistry. MnTiO3 is an antiferromagnetic semiconductor and therefore has potential applications in integrated circuit devices, as a gas and humidity sensor, in non-linear optics, and as a visible-light-activated photocatalyst. The above facts indicate that the eutectic MnTiO3-TiO2 constitutes an extremely promising material that should be studied. Although Raman spectroscopy is a powerful method for characterizing materials, to our knowledge Raman studies of eutectics are very limited, there are no Raman studies of the MnTiO3-TiO2 eutectic, and papers regarding this material are scarce. The MnTiO3-TiO2 eutectic, as well as TiO2 and MnTiO3 single crystals, were grown by the micro-pulling-down method at the Institute of Electronic Materials Technology in Warsaw, Poland. A nitrogen atmosphere was maintained during the whole crystal growth process. The as-grown samples of the MnTiO3-TiO2 eutectic, as well as the TiO2 and MnTiO3 single crystals, are black and opaque. Samples were cut perpendicular to the growth direction, and the cross sections were examined with scanning electron microscopy (SEM) and Raman spectroscopy. The present studies showed that maintaining a nitrogen atmosphere during the crystal growth process may result in obtaining black TiO2 crystals. SEM and Raman experiments showed that the studied eutectic consists of three distinct regions: two of these regions correspond to MnTiO3, while the third corresponds to a TiO2-xNx phase. Raman studies indicated that the TiO2-xNx phase crystallizes in the rutile structure. The studies show that Raman experiments may be successfully used to characterize eutectic materials. SEM and micro-Raman experiments were used to establish the phase composition of the studied eutectic and revealed that the TiO2 phase had been doped with nitrogen; the TiO2 phase is therefore, in fact, a solid solution with TiO2-xNx composition. The remaining two phases exhibit Raman lines of both rutile TiO2 and MnTiO3, which points to some kind of coexistence of these phases in the studied eutectic.

Keywords: compound materials, eutectic growth and characterization, Raman spectroscopy, rutile TiO₂

Procedia PDF Downloads 193
292 Variations in Breast Aesthetic Reconstruction Rates between Asian and Caucasian Patients Post Mastectomy in a UK Tertiary Breast Referral Centre: A Five-Year Institutional Review

Authors: Wisam Ismail, Chole Wright, Elizabeth Baker, Cathy Tait, Mohamed Salhab, Richard Linforth

Abstract:

Background: Post-mastectomy breast reconstruction is an important treatment option for women with breast cancer, with psychosocial, emotional, and quality-of-life benefits. Despite this, Asian patients are one-fifth as likely as Caucasian patients to undergo reconstruction after mastectomy. Aim: This study aimed to assess the difference in breast reconstruction rates between Asian and Caucasian patients treated at Bradford Teaching Hospitals between May 2011 and December 2015. The long-term goal is to equip healthcare professionals to improve breast cancer treatment outcomes by increasing breast reconstruction rates in this sub-population. Methods: All patients undergoing mastectomy were identified using a prospectively collected departmental database. Further data were obtained via retrospective electronic case note review. Bradford's city population was about 530,000 at the end of 2015, with 67.44% of the population belonging to White ethnic groups and 26.83% to Asian ethnic groups (UK population census). The majority of the Asian population speaks Urdu, so an Urdu-speaking breast care nurse was appointed to facilitate communication and deliver a better understanding of the reconstruction options and pathways. Statistical analysis was undertaken using the SAS program. Patients were stratified by age, self-reported ethnicity, axillary surgery, and reconstruction. Relative odds were calculated using univariate and multivariate logistic regression analyses with adjustment for known confounders. An Urdu-speaking breast care nurse was employed throughout this period to facilitate communication and patient decision-making. Results: 506 patients underwent mastectomy over 5 years: 72 (14%) Asian v. 434 (85%) Caucasian. The overall median age was 64 years (SD 1.1); the Asian median age was 62 (SD 0.9) versus 65 (SD 1.2) for Caucasian patients. The total axillary clearance rate was 30% (42% Asian v. 30% Caucasian). The overall reconstruction rate was 126 patients (28.9%). Only 6 of 72 Asian patients (8%) underwent breast reconstruction versus 121 of 434 Caucasian patients (28%) (p < 0.04), odds ratio 0.68 (95% confidence interval 0.57-0.79). Conclusions: There is a significant difference in post-mastectomy reconstruction rates between Asian and Caucasian patients. This difference is likely to be multi-factorial. Higher rates of axillary clearance in Asian patients might suggest later disease presentation and/or higher rates of subsequent adjuvant therapy, both of which can impact the suitability of breast reconstruction. Strategies aimed at reducing racial disparities in breast reconstruction should include symptom awareness to enable earlier presentation and facilitated communication to ensure informed decision-making.
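
For readers unfamiliar with the statistic quoted above, the short sketch below shows how a crude odds ratio and its 95% confidence interval are computed from a 2x2 table; the counts are placeholders rather than the study's data, and the paper's reported OR is additionally adjusted for confounders.

```python
# Crude odds ratio with 95% CI from a 2x2 table (placeholder counts).
import math

def odds_ratio(a, b, c, d):
    """2x2 table: a,b = events/non-events in group 1; c,d = events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + z * se_log) for z in (-1.96, 1.96))
    return or_, lo, hi

# hypothetical counts: group 1 = Asian patients, group 2 = Caucasian patients
or_, lo, hi = odds_ratio(a=10, b=90, c=120, d=280)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```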

Keywords: aesthetic, Asian, breast, reconstruction

Procedia PDF Downloads 276
291 The Effectiveness of Psychosocial Interventions for Survivors of Natural Disasters: A Systematic Review

Authors: Santhani M. Selveindran

Abstract:

Background: Natural disasters are traumatic global events that are becoming increasingly common, with a significant psychosocial impact on survivors. This impact results not only in psychosocial distress but, for many, can lead to psychosocial disorders and chronic psychopathology. While interventions are currently available that seek to prevent and treat these psychosocial sequelae, their effectiveness is uncertain. The evidence base is emerging, with more primary studies evaluating the effectiveness of various psychosocial interventions for survivors of natural disasters, and it remains to be synthesized. Aim of Review: To identify, critically appraise, and synthesize the current evidence base on the effectiveness of psychosocial interventions in preventing or treating Post-Traumatic Stress Disorder (PTSD), Major Depressive Disorder (MDD), and/or Generalized Anxiety Disorder (GAD) in adults and children who are survivors of natural disasters. Methods: A protocol was developed as a guide for carrying out this review. A systematic search was conducted in eight international electronic databases, three grey literature databases, one dissertation and thesis repository, and the websites of six humanitarian and non-governmental organizations renowned for their work on natural disasters, together with bibliographic and citation searching for eligible articles. Papers meeting the specific inclusion criteria underwent quality assessment using the Downs and Black checklist. Data were extracted from the included papers and analysed by way of narrative synthesis. Results: Database and website searching returned 3777 papers, of which 31 met the criteria for inclusion. Two additional papers were obtained through bibliographic and citation searching. The methodological quality of most papers was fair. Twenty-five studies evaluated psychological interventions, five evaluated social interventions, and three evaluated 'mixed' psychological and social interventions. All studies, irrespective of methodological quality, reported post-intervention reductions in symptom scores for PTSD, depression, and/or anxiety and, where assessed, reduced diagnoses of PTSD and MDD, as well as improvements in self-efficacy and quality of life. Statistically significant results were seen in 27 studies. However, three studies demonstrated that the evaluated interventions may not have been very beneficial. Conclusions: The overall positive results suggest that psychosocial interventions of any kind are favourable and should be delivered to all natural disaster survivors, irrespective of age, country, and phase of disaster. Yet the heterogeneity and methodological shortcomings of the current evidence base make it difficult to draw the definite conclusions needed to formulate categorical guidance or frameworks. Further, rigorously conducted research is needed in this area, although the feasibility of such research, given the context and nature of the problem, is also recognized.

Keywords: psychosocial interventions, natural disasters, survivors, effectiveness

Procedia PDF Downloads 154
290 Evaluating the Teaching and Learning Value of Tablets

Authors: Willem J. A. Louw

Abstract:

The wave of new advanced computing technology developed during the recent past has significantly changed the way we communicate, collaborate, and collect information. It has created a new technology environment and paradigm in which our children and students grow up, and this impacts their learning. Research has confirmed that Generation Y students have a preference for learning in the new technology environment. The challenge, or question, is: how do we adjust our teaching and learning to make the most of these changes? The complexity of effective and efficient teaching and learning must not be underestimated, and changes must be preceded by proper objective research to prevent any haphazard developments that could do more harm than good. A blended learning approach has been used in the Forestry department for a number of years, including the use of electronic peer-assisted learning (e-pal) in a fixed-computer set-up within a learning management system environment. It was decided to extend the investigation and do some exploratory research by using a range of different tablet devices. For this purpose, learning activities or assignments were designed to cover aspects of communication, collaboration, and the collection of information. The Moodle learning management system was used to present normal module information, to communicate with students, and for feedback and data collection. Student feedback was collected by using an online questionnaire and informal discussions. The research project was implemented in 2013, 2014, and 2015 amongst first- and third-year students doing a three-year technical tertiary forestry qualification in commercial plantation management. In general, more than 80% of the students indicated that the device was very useful in their learning environment, while the rest indicated that the devices were not very useful. More than ninety percent of the students acknowledged that they would like to continue using the devices for all of their modules, whilst the rest indicated that they could function efficiently without the devices. Results indicated that information collection (access to resources) was rated as the most advantageous factor, followed by communication and collaboration. The main general advantages of using tablets were listed by the students as mobility (portability); 24/7 access to learning material and information of any kind on a user-friendly device in a Wi-Fi environment; fast computing speeds; saving time, effort, and airtime through Skype and e-mail; and the use of various applications. Ownership of the device is a critical factor, while risk was identified as a major potential constraint. Significant differences were reported between the different types and quality of tablets; the preferred types are those with a bigger screen and those with better overall functionality and quality features. Tablets significantly enhance the collaboration, communication, and information collection of the students. They do not, however, replace the need for a computer/laptop because of limited storage and computation capacity, small screen size, and inefficient typing.

Keywords: tablets, teaching, blended learning, tablet quality

Procedia PDF Downloads 248
289 A Systematic Review of Chronic Neurologic Complications of COVID-19: A Potential Risk Factor for Narcolepsy, Parkinson's Disease, and Multiple Sclerosis

Authors: Sulemana Saibu, Moses Ikpeme

Abstract:

Background: The severity of the COVID-19 pandemic, brought on by the SARS-CoV-2 coronavirus, has been unprecedented since the 1918 influenza pandemic. Based on knowledge of past coronaviruses, particularly those that caused the severe acute respiratory syndrome and Middle East respiratory syndrome outbreaks, SARS-CoV-2 cases of CNS and peripheral nervous system disease, including neurodegenerative disorders and chronic immune-mediated diseases, may be anticipated. Although respiratory symptoms are the most common clinical presentation, neurological symptoms are increasingly recognized, raising concerns about their potential role in causing Parkinson's disease, multiple sclerosis, and narcolepsy. This systematic review aims to summarize the current evidence by exploring how COVID-19 infection may overlap with etiological mechanisms resulting in narcolepsy, Parkinson's disease, and multiple sclerosis. Methods: A systematic search was conducted using electronic databases (PubMed/MEDLINE, Embase, PsycINFO, SciELO, Web of Science, ProQuest (Biotechnology, Virology, and AIDS), Scopus, and CINAHL) to identify studies published between January 2020 and December 2022 that investigated the association between COVID-19 and Parkinson's disease, multiple sclerosis, and narcolepsy. The review was performed and reported per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Study quality was assessed using the Critical Appraisal Skills Programme checklist and the Joanna Briggs Institute critical appraisal tools. Results: A total of 21 studies out of 1025 met the inclusion criteria, including 8 studies reporting on Parkinson's disease, 11 on multiple sclerosis, and 2 on narcolepsy. Narcolepsy, Parkinson's disease, and multiple sclerosis were found to have a higher incidence in individuals with COVID-19 than in the general population. The findings imply that COVID-19 may worsen the signs of, or induce, multiple sclerosis and Parkinson's disease and may raise the risk of developing narcolepsy. Further research is required to confirm these connections because the available data are insufficient. Conclusion: According to the existing data, COVID-19 may raise the risk of narcolepsy and may have a causative relationship with Parkinson's disease, multiple sclerosis, and other diseases. More study is required to confirm these correlations and to pinpoint the probable mechanisms behind these interactions. Clinicians should be aware of how COVID-19 may affect various neurological illnesses and should treat affected patients accordingly.

Keywords: COVID-19, parkinson’s disease, multiple sclerosis, narcolepsy, neurological disorders, sars-cov-2, neurodegenerative disorders, chronic immune-mediated diseases

Procedia PDF Downloads 84
288 A Literature Review on the Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster

Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon

Abstract:

In a disaster event, sharing patient information between pre-hospital Emergency Medical Services (EMS) and hospital Emergency Departments (ED) is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base in relation to communication between pre-hospital EMS and ED professionals through the use of Information and Communication Technology (ICT). This study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analysed according to the key screening themes which emerged from the literature search. Twenty-two studies were included: eleven employed quantitative methods, seven used qualitative methods, and four used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. This theme reported that disaster plans are in place in hospitals and, in some cases, there are interagency agreements with pre-hospital services and relevant stakeholders. However, the findings showed that the disaster plans highlighted in these studies lacked information regarding coordinated communication within and between pre-hospital and hospital teams. (2) Communication systems used in disasters. This theme highlighted that, although various communication systems are used between and within hospitals and pre-hospital services, technical issues have influenced communication between teams during disasters. (3) Integrated information management systems. This theme suggested the need for an integrated health information system which can help pre-hospital and hospital staff to record patient data and ensure the data are shared. (4) Disaster training and drills. While some studies analysed disaster drills and training, the majority focused on hospital departments other than EMTs. These studies suggest the need for simulation-based disaster training and drills that include EMTs. This review demonstrates that considerable gaps remain in the understanding of communication between EMS and ED hospital staff in relation to disaster response. The review shows that, although different types of ICT are used, various issues remain which affect coordinated communication among the relevant professionals.

Keywords: communication, emergency communication services, emergency medical teams, emergency physicians, emergency nursing, paramedics, information and communication technology, communication systems

Procedia PDF Downloads 86
287 Using Chatbots to Create Situational Content for Coursework

Authors: B. Bricklin Zeff

Abstract:

This research explores the development and application of a specialized chatbot tailored for a nursing English course, with the primary objective of augmenting student engagement through situational content and responsiveness to key expressions and vocabulary. Introducing the chatbot, elucidating its purpose, and outlining its functionality are crucial initial steps in the research study, as they provide a foundation for understanding the design and objectives of the specialized chatbot and establish the context for subsequent evaluations and analyses of the chatbot's impact on student engagement and language learning within the nursing education domain. The subsequent exploration of the intricate language model development process underscores the fusion of scientific methodology and artistic considerations in this application of artificial intelligence (AI). Practical principles, tailored for educators and curriculum developers in nursing and extending beyond AI and education, are considered, along with insights into leveraging technology for enhanced language learning in specialized fields and potential applications of similar chatbots in other professional English courses. The overarching vision is to illuminate how AI can transform language learning, rendering it more interactive and contextually relevant. The presented chatbot is a tangible example, equipping educators with a practical tool to enhance their teaching practices. The methodologies employed in this research encompass surveys and discussions to gather feedback on the chatbot's usability, effectiveness, and potential improvements. The chatbot system was integrated into a nursing English course, facilitating the collection of valuable feedback from participants. Significant findings from the study underscore the chatbot's effectiveness in encouraging more verbal practice of the target expressions and vocabulary necessary for performance in role-play assessment strategies. This outcome emphasizes the practical implications of integrating AI into language education in specialized fields. This research holds significance for educators and curriculum developers in the nursing field, offering insights into integrating technology for enhanced English language learning. The study's major findings contribute valuable perspectives on the practical impact of the chatbot on student interaction and verbal practice. Ultimately, the research sheds light on the transformative potential of AI in making language learning more interactive and contextually relevant, particularly within specialized domains such as nursing.
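
As a toy illustration of the kind of keyword-triggered, situational response logic described above, and not the study's actual system, the sketch below matches a learner's utterance against key expressions and returns a scenario-appropriate reply; the expressions and replies are invented examples.

```python
# Minimal keyword-matching chatbot for nursing role-play scenarios (illustrative only).
import re

SCENARIO_RESPONSES = [
    (r"\b(pain|hurts?)\b", "I'm sorry to hear that. Can you point to where it hurts and rate the pain from 1 to 10?"),
    (r"\b(blood pressure|bp)\b", "I'm going to take your blood pressure now. Please roll up your sleeve."),
    (r"\b(allerg(y|ies|ic))\b", "Thank you for telling me. Are you allergic to any medications?"),
]
FALLBACK = "Could you tell me a little more about that?"

def reply(utterance: str) -> str:
    """Return the first scenario response whose key expression matches the learner's utterance."""
    for pattern, response in SCENARIO_RESPONSES:
        if re.search(pattern, utterance.lower()):
            return response
    return FALLBACK

print(reply("My back really hurts today"))
print(reply("Do I need to tell you about my allergies?"))
```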

Keywords: chatbot, nursing, pragmatics, role-play, AI

Procedia PDF Downloads 65
286 AI-Enabled Smart Contracts for Reliable Traceability in Industry 4.0

Authors: Harris Niavis, Dimitra Politaki

Abstract:

The manufacturing industry has been collecting vast amounts of data for monitoring product quality thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. At the same time, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion in the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism uniquely combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the incoming sensor data. Specifically, Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM) and dense autoencoders, as well as Generative Adversarial Network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors. The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs that expose the functionality to end-users. The results of this work demonstrate that such a system can increase the quality of the end-products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify quality records through the entire production chain and take advantage of the multitude of monitoring records in their databases.
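As a hedged sketch of the anomaly-screening step described above, using only one of the model families the authors combine, the snippet below flags point anomalies in a sensor stream by thresholding the residuals of an ARIMA fit. The model order and the 3-sigma threshold are illustrative assumptions, not the system's actual configuration.

```python
# Hedged sketch of point-anomaly screening on a sensor stream using ARIMA
# residuals (one of the model families named above). The order and the
# 3-sigma threshold are illustrative assumptions, not the system's settings.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def flag_point_anomalies(series: np.ndarray, order=(1, 1, 1), z_thresh=3.0):
    """Return indices whose fitted residual exceeds z_thresh standard deviations."""
    fit = ARIMA(series, order=order).fit()
    resid = np.asarray(fit.resid)
    z = (resid - resid.mean()) / resid.std()
    return np.where(np.abs(z) > z_thresh)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = np.cumsum(rng.normal(size=500))   # synthetic drifting sensor signal
    stream[250] += 15.0                        # injected outlier
    print(flag_point_anomalies(stream))        # expected to include index ~250
```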

Keywords: blockchain, data quality, Industry 4.0, product quality

Procedia PDF Downloads 189
285 Optimization Principles of Eddy Current Separator for Mixtures with Different Particle Sizes

Authors: Cao Bin, Yuan Yi, Wang Qiang, Amor Abdelkader, Ali Reza Kamali, Diogo Montalvão

Abstract:

The study of the electrodynamic behavior of non-ferrous particles in time-varying magnetic fields is a promising area of research with wide applications, including the recycling of non-ferrous metals, mechanical transmission, and space debris. The key technology for recovering non-ferrous metals is eddy current separation (ECS), which utilizes the eddy current force and torque to separate non-ferrous metals. ECS has several advantages, such as low energy consumption, large processing capacity, and no secondary pollution, making it suitable for processing various mixtures like electronic scrap, auto shredder residue, aluminum scrap, and incineration bottom ash. Improving the separation efficiency of mixtures with different particle sizes in ECS can create significant social and economic benefits. Our previous study investigated the influence of particle size on separation efficiency by combining numerical simulations and separation experiments. Pearson correlation analysis found a strong correlation between the eddy current force in the simulations and the repulsion distance in the experiments, which confirmed the effectiveness of our simulation model. The interaction effects between particle size and material type, rotational speed, and magnetic pole arrangement were examined, offering valuable insights for the design and optimization of eddy current separators. The underlying mechanism behind the effect of particle size on separation efficiency was identified by analyzing the eddy current and the field gradient. The results showed that the magnitude and distribution heterogeneity of the eddy current and the magnetic field gradient increase with particle size in eddy current separation. Based on this, we further found that increasing the curvature of the magnetic field lines within particles can also increase the eddy current force, providing an optimized approach to improving the separation efficiency of fine particles. By combining the results of these studies, a more systematic and comprehensive set of optimization guidelines can be proposed for mixtures with different particle size ranges. The separation efficiency of fine particles can be improved by increasing the rotational speed, the curvature of the magnetic field lines, and the electrical conductivity/density of the materials, as well as by utilizing the eddy current torque. When designing an ECS, the particle size range of the target mixture should be investigated in advance, and suitable parameters for separating the mixture can be set accordingly. In summary, these results can guide the design and optimization of ECS and also expand its application areas.
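A hedged sketch of the Pearson correlation check mentioned above, between the simulated eddy current force and the experimentally measured repulsion distance across particle sizes. All values are fabricated placeholders, not the study's measurements.

```python
# Hedged sketch of the Pearson correlation analysis mentioned above.
# All values are fabricated placeholders, not the study's measurements.
import numpy as np
from scipy.stats import pearsonr

particle_size_mm   = np.array([2, 4, 6, 8, 10, 12])            # hypothetical sizes
simulated_force_mN = np.array([0.4, 1.1, 2.3, 4.0, 6.2, 8.9])  # from simulation
repulsion_dist_cm  = np.array([1.0, 2.2, 4.1, 6.8, 9.9, 13.5]) # from experiment

r, p_value = pearsonr(simulated_force_mN, repulsion_dist_cm)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")
```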

Keywords: eddy current separation, particle size, numerical simulation, metal recovery

Procedia PDF Downloads 89
284 Mesoporous BiVO4 Thin Films as Efficient Visible Light Driven Photocatalyst

Authors: Karolina Ordon, Sandrine Coste, Malgorzata Makowska-Janusik, Abdelhadi Kassiba

Abstract:

Photocatalytic processes play a key role in the production of new energy sources (such as hydrogen), the design of self-cleaning surfaces, and the preservation of the environment. The most challenging task concerns the highly efficient purification of water. In this process, organic pollutants in solution are decomposed into simple, non-toxic compounds such as H2O and CO2. The best-known photocatalytic materials are the ZnO, CdS and TiO2 semiconductors, with TiO2 in particular employed as an efficient photocatalyst despite a wide band gap of 3.2 eV, which exploits only the UV radiation of the solar spectrum. A promising material with visible-light-induced photoactivity was therefore sought in the monoclinic polytype of BiVO4, which has an energy gap of about 2.4 eV. As in any heterogeneous photocatalysis, a high contact surface is required; BiVO4 as a photocatalyst can thus be optimized by increasing its surface area through the synthesis of a mesoporous structure. The main goal of the present work consists in the synthesis and characterization of BiVO4 mesoporous thin films. The sol-gel synthesis was carried out using standard surfactants such as P123 and F127, and the thin films were deposited by spin and dip coating. The structural analysis of the obtained material was performed by X-ray diffraction (XRD) and Raman spectroscopy, and the surface of the resulting structure was investigated using scanning electron microscopy (SEM). Computer simulations modeling the optical and electronic properties of bulk BiVO4 were carried out using DFT (density functional theory), while the semiempirical parameterized method PM6 was used to compute the physical properties of BiVO4 nanostructures. The Raman and IR absorption spectra were also measured for the synthesized mesoporous material, and the results were compared with the theoretical predictions. The simulations of nanostructured BiVO4 pointed out the occurrence of quantum confinement for nanosized clusters, leading to a widening of the band gap. This result tempers the relevance of nanosized objects for harvesting a wide part of the solar spectrum. A balance was therefore sought experimentally through the mesoporous nature of the films, which enhances the contact surface required for heterogeneous catalysis without lowering the nanocrystallite size below the critical values that induce an increased band gap. The present contribution will discuss the relevant features of the mesoporous films with respect to their photocatalytic responses.
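For context only, and not the DFT/PM6 methodology used in this work, the confinement-induced band-gap widening noted above is commonly illustrated with the effective-mass (Brus) approximation for a cluster of radius R:

```latex
E_g(R) \;\approx\; E_g^{\mathrm{bulk}}
  + \frac{\hbar^{2}\pi^{2}}{2R^{2}}\left(\frac{1}{m_e^{*}} + \frac{1}{m_h^{*}}\right)
  - \frac{1.8\,e^{2}}{4\pi\varepsilon\varepsilon_{0}R}
```

where m_e* and m_h* are the effective masses of the electron and hole and ε is the relative permittivity. The 1/R² confinement term dominates at small R, which is why shrinking the crystallites below a critical size widens the gap.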

Keywords: bismuth vanadate, photocatalysis, thin film, quantum-chemical calculations

Procedia PDF Downloads 323
283 The Effects of Computer Game-Based Pedagogy on Graduate Students Statistics Performance

Authors: Eva Laryea, Clement Yeboah

Abstract:

A pretest-posttest, within-subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the Southeast United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The t-test for statistical knowledge was statistically significant, indicating significant mean gains in statistical knowledge as a function of the game-based intervention. Likewise, the t-test for statistics-related anxiety was also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop interest in the subject matter and spend quality time learning the course material as they play, without noticing that they are studying a course they may have presumed to be hard. The future directions of the present study are promising, as technology continues to advance and become more widely available. Potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way graduate students learn basic statistics and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers, and will continue to be a dynamic and rapidly evolving field for years to come.
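An illustrative sketch of the analysis described above, using fabricated pretest/posttest scores rather than the study's data:

```python
# Illustrative sketch of paired-samples t-tests on pretest/posttest scores.
# The score arrays are fabricated placeholders, not the study's data (N = 34).
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(42)
pre_knowledge  = rng.normal(60, 10, size=34)
post_knowledge = pre_knowledge + rng.normal(8, 5, size=34)    # simulated gain
pre_anxiety    = rng.normal(70, 12, size=34)
post_anxiety   = pre_anxiety - rng.normal(6, 4, size=34)      # simulated drop

t_know, p_know = ttest_rel(post_knowledge, pre_knowledge)
t_anx,  p_anx  = ttest_rel(post_anxiety,  pre_anxiety)
print(f"knowledge: t = {t_know:.2f}, p = {p_know:.4f}")
print(f"anxiety:   t = {t_anx:.2f},  p = {p_anx:.4f}")
```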

Keywords: pretest-posttest within-subjects experimental design, achievement, statistics-related anxiety

Procedia PDF Downloads 58
282 Chemical Technology Approach for Obtaining Carbon Structures Containing Reinforced Ceramic Materials Based on Alumina

Authors: T. Kuchukhidze, N. Jalagonia, T. Archuadze, G. Bokuchava

Abstract:

The growing scientific and technological progress of modern civilization makes it urgent to produce construction materials that can successfully work under conditions of high temperature, radiation, pressure, speed, and chemically aggressive environments. Very few types of material can withstand such extreme conditions, and among them ceramic materials come first. Corundum ceramic is the most useful material for creating structural components and products of various purposes because of its low cost, easily accessible raw materials, and good combination of physical-chemical properties. However, ceramic composite materials have one disadvantage: they are less plastic and have lower toughness. In order to increase plasticity, the ceramics are reinforced with various dopants that reduce crack growth. It has been shown that adding even a small amount of carbon fibers and carbon nanotubes (CNT) as reinforcing material significantly improves the mechanical properties of the products while keeping the advantages of alundum ceramics. Graphene in the composite material acts in the same way as inorganic dopants (MgO, ZrO2, SiC and others) and performs the role of an aluminum oxide inhibitor, as it creates a shell that makes it possible to reduce the sintering temperature; at the same time, it acts as a damper, because scattering of a shock wave takes place on the carbon structures. The application of different structural modifications of carbon (graphene, nanotubes and others) as reinforcement makes it possible to create multi-purpose, highly demanded composite materials based on alundum ceramics. The present work offers a simplified technology for obtaining aluminum oxide ceramics reinforced with carbon nanostructures, in which chemical modification with the doping carbon nanostructures is implemented during the synthesis of the final powdery composite, alumina. In the charge, the doping carbon nanostructures are connected to the matrix substance through C-O-Al bonds, which provides their homogeneous spatial distribution. In the ceramic obtained by consolidating such powders, the carbon fragments are equally distributed throughout the aluminum oxide matrix, which increases bending strength and crack resistance. The proposed way of preparing the charge simplifies the technological process, decreases energy consumption and synthesis duration, and therefore requires less financial expense. In the implementation of this work, modern instrumental methods were used: electron and optical microscopy, X-ray structural and granulometric analysis, and UV, IR, and Raman spectroscopy.

Keywords: ceramic materials, α-Al₂O₃, carbon nanostructures, composites, characterization, hot-pressing

Procedia PDF Downloads 119
281 Electret: A Solution to Partial Discharge in High Voltage Applications

Authors: Farhina Haque, Chanyeop Park

Abstract:

The high efficiency, high field, and high power density provided by wide bandgap (WBG) semiconductors and advanced power electronic converter (PEC) topologies have enabled the dynamic control of power in medium to high voltage systems. Although WBG semiconductors outperform conventional silicon-based devices in terms of voltage rating, switching speed, and efficiency, the increased voltage handling properties, high dv/dt, and compact device packaging increase local electric fields, which are the main causes of partial discharge (PD) in advanced medium and high voltage applications. PD, which occurs actively in voids, triple points, and airgaps, is an inevitable dielectric challenge that causes insulation and device aging. The aging process accelerates over time and eventually leads to the complete failure of the application. Hence, it is critical to mitigate PD. Sharp edges, airgaps, triple points, and bubbles are common defects that exist in any medium to high voltage device. The defects are created during the manufacturing processes of the devices and are prone to high-electric-field-induced PD due to the low permittivity and low breakdown strength of the gaseous medium filling the defects. A contemporary approach to mitigating PD by neutralizing electric fields in high power density applications is introduced in this study. To neutralize the locally enhanced electric fields that occur around triple points, airgaps, sharp edges, and bubbles, electrets are developed and incorporated into high voltage applications. Electrets are electric-field-emitting dielectric materials embedded with electrical charges on the surface and in the bulk. In this study, electrets are fabricated by electrically charging polyvinylidene difluoride (PVDF) films using the widely adopted triode corona discharge method. To investigate the PD mitigation performance of the fabricated electret films, a series of PD experiments is conducted on both charged and uncharged PVDF films under square voltage stimuli that represent PWM waveforms. In addition to single-layer electrets, multiple layers of electrets are also tested to mitigate PD caused by higher system voltages. The electret-based approach shows great promise in mitigating PD by neutralizing the local electric field. The results of the PD measurements suggest that the development of an ultimate solution to this decades-long dielectric challenge would be possible with further developments in the fabrication process of electrets.

Keywords: electrets, high power density, partial discharge, triode corona discharge

Procedia PDF Downloads 203
280 A Hybrid Artificial Intelligence and Two Dimensional Depth Averaged Numerical Model for Solving Shallow Water and Exner Equations Simultaneously

Authors: S. Mehrab Amiri, Nasser Talebbeydokhti

Abstract:

Modeling sediment transport processes by means of numerical approaches often poses severe challenges. A number of techniques have been suggested to solve flow and sediment equations in decoupled, semi-coupled or fully coupled forms. Furthermore, in order to capture flow discontinuities, techniques such as artificial viscosity and shock fitting have been proposed for solving these equations, most of which require careful calibration. In this research, a numerical scheme for solving the shallow water and Exner equations in fully coupled form is presented. A First-Order Centered scheme is applied to produce the required numerical fluxes, and the reconstruction process is carried out using the Monotonic Upstream Scheme for Conservation Laws (MUSCL) to achieve a high-order scheme. In order to satisfy the C-property of the scheme in the presence of bed topography, the Surface Gradient Method is employed. Combining the presented scheme with a fourth-order Runge-Kutta algorithm for time integration yields a competent numerical scheme. In addition, to handle non-prismatic channel problems, the Cartesian Cut Cell Method is employed. A trained Multi-Layer Perceptron Artificial Neural Network of the Feed-Forward Back-Propagation (FFBP) type estimates sediment flow discharge in the model instead of the usual empirical formulas. The hydrodynamic part of the model is tested to show its capability in simulating flow discontinuities, transcritical flows, wetting/drying conditions and non-prismatic channel flows. To this end, dam-break flow onto a locally non-prismatic converging-diverging channel with initially dry bed conditions is modeled. The morphodynamic part of the model is verified by simulating dam break on a dry movable bed and bed level variations at an alluvial junction. The results show that the model is capable of capturing flow discontinuities, solving wetting/drying problems even in non-prismatic channels, and producing proper results for movable bed situations. It can also be deduced that applying an Artificial Neural Network, instead of common empirical formulas, for estimating sediment flow discharge leads to more accurate results.
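For reference, a minimal one-dimensional form of the coupled shallow water-Exner system can be written as below; the model itself solves the two-dimensional depth-averaged version, and the sediment discharge q_s is supplied here by the trained FFBP network rather than an empirical formula.

```latex
\begin{aligned}
&\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,\\[2pt]
&\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2}gh^{2}\right)
  = -\,gh\,\frac{\partial z_b}{\partial x} - \frac{\tau_b}{\rho},\\[2pt]
&(1-p)\,\frac{\partial z_b}{\partial t} + \frac{\partial q_s}{\partial x} = 0,
\end{aligned}
```

where h is the flow depth, u the depth-averaged velocity, z_b the bed elevation, p the bed porosity, τ_b the bed shear stress, ρ the water density, and q_s the sediment discharge.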

Keywords: artificial neural network, morphodynamic model, sediment continuity equation, shallow water equations

Procedia PDF Downloads 187
279 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics

Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee

Abstract:

Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucinations", the generation of outputs that are not grounded in the input data, which hinder adoption in production. A common practice to mitigate the hallucination problem is to use a Retrieval Augmented Generation (RAG) system to ground the LLM's responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response that is based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, the RAG system is not suitable for tabular data and subsequent data analysis tasks for multiple reasons, such as information loss, data format, and the retrieval mechanism. In this study, we explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When deployed as a beta version on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results, as it was able to provide market insights and data visualizations with high accuracy and extensive coverage while abstracting the complexities for real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding and enhancement without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
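A conceptual sketch of the planning-and-execution plus code-generation loop described above, assuming a hypothetical call_llm function that wraps any LLM API; the prompts, sandboxing, and error handling of the production system are omitted.

```python
# Conceptual sketch (assumptions labeled): the planning-and-execution plus
# code-generation loop described above. `call_llm` is a hypothetical stand-in
# for any LLM API; prompts, sandboxing and retries of the real system are omitted.
from typing import Callable, Dict, List

def plan(task: str, call_llm: Callable[[str], str]) -> List[str]:
    """Planner agent: split an analytical task into simpler sub-tasks."""
    reply = call_llm(f"Break this data analysis task into numbered sub-tasks:\n{task}")
    return [line.strip() for line in reply.splitlines() if line.strip()]

def execute(sub_task: str, call_llm: Callable[[str], str], data: Dict) -> str:
    """Code-generation agent: produce Python for one sub-task and run it."""
    code = call_llm(
        f"Write Python that uses the variable `df` and stores its answer in `result`.\nTask: {sub_task}"
    )
    namespace = dict(data)              # e.g. {"df": <pandas DataFrame>}
    exec(code, namespace)               # NOTE: sandbox this in any real deployment
    return str(namespace.get("result", ""))

def answer(task: str, call_llm: Callable[[str], str], data: Dict) -> str:
    """Final step: summarize the outputs of all executed code segments."""
    partial = [execute(s, call_llm, data) for s in plan(task, call_llm)]
    return call_llm("Summarize these intermediate results for the user:\n" + "\n".join(partial))
```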

Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru

Procedia PDF Downloads 87
278 Exploring Instructional Designs on the Socio-Scientific Issues-Based Learning Method in Respect to STEM Education for Measuring Reasonable Ethics on Electromagnetic Wave through Science Attitudes toward Physics

Authors: Adisorn Banhan, Toansakul Santiboon, Prasong Saihong

Abstract:

The Socio-Scientific Issues-Based Learning (SSIBL) method was compared with blended STEM instruction using a sample of 84 students in two classes at the 11th grade level of Sarakham Pittayakhom School. The two instructional models were implemented through five instructional lesson plans in the context of the electromagnetic wave issue. The research procedures assigned each instructional method to one of two groups: a 40-student experimental group received STEM education (STEMe) instruction, and a 40-student control group received Socio-Scientific Issues-Based Learning (SSIBL) instruction. Associations between students' learning achievements under each instructional method and their science attitudes toward physics, as reflected in their predictions and exploring activities, were compared for the STEMe and SSIBL methods. The Measuring Reasonable Ethics Test (MRET) assessed students' reasonable ethics under the STEMe and SSIBL instructional design methods in each group. A pretest-posttest technique was used to monitor and evaluate students' performance in reasonable ethics on the electromagnetic wave issue in the STEMe and SSIBL classes. Students were observed and gained experience with the phenomena being studied under the SSIBL model. In support of STEM, it is not just teaching about Science, Technology, Engineering, and Mathematics; it is a culture that needs to be cultivated to help create a problem-solving, creative, critical-thinking workforce for tomorrow in physics. Students' attitudes were assessed with the Test Of Physics-Related Attitude (TOPRA), modified from the original Test Of Science-Related Attitude (TOSRA). Comparisons of students' learning achievements between the STEMe and SSIBL instructional methods were analyzed, and associations between students' reasonable ethics and their science attitudes toward physics were examined for both methods. The findings show that the efficiency of the SSIBL and STEMe innovations met the IOC criteria and exceeded the 80/80 standard level. Students' learning achievements and later outcomes in the control and experimental groups taught with the SSIBL and STEMe differed significantly at the .05 level. Comparing students' reasonable ethics, responses to the instructional activities were higher with the STEMe than with the SSIBL instructional method. For associations with students' later learning achievements, the predictive efficiency (R²) values indicate that 67% and 75% of the variance for the SSIBL, and 74% and 81% for the STEMe, were attributable to students' developing reasonable ethics and science attitudes toward physics, respectively.

Keywords: socio-scientific issues-based learning method, STEM education, science attitudes, measurement, reasonable ethics, physics classes

Procedia PDF Downloads 292
277 Servitization in Machine and Plant Engineering: Leveraging Generative AI for Effective Product Portfolio Management Amidst Disruptive Innovations

Authors: Till Gramberg

Abstract:

In the dynamic world of machine and plant engineering, stagnation in the growth of new product sales compels companies to reconsider their business models. The increasing shift toward service orientation, known as "servitization," along with challenges posed by digitalization and sustainability, necessitates an adaptation of product portfolio management (PPM). Against this backdrop, this study investigates the current challenges and requirements of PPM in this industrial context and develops a framework for the application of generative artificial intelligence (AI) to enhance agility and efficiency in PPM processes. The research approach of this study is based on a mixed-method design. Initially, qualitative interviews with industry experts were conducted to gain a deep understanding of the specific challenges and requirements in PPM. These interviews were analyzed using the Gioia method, painting a detailed picture of the existing issues and needs within the sector. This was complemented by a quantitative online survey. The combination of qualitative and quantitative research enabled a comprehensive understanding of the current challenges in the practical application of machine and plant engineering PPM. Based on these insights, a specific framework for the application of generative AI in PPM was developed. This framework aims to assist companies in implementing faster and more agile processes, systematically integrating dynamic requirements from trends such as digitalization and sustainability into their PPM process. Utilizing generative AI technologies, companies can more quickly identify and respond to trends and market changes, allowing for a more efficient and targeted adaptation of the product portfolio. The study emphasizes the importance of an agile and reactive approach to PPM in a rapidly changing environment. It demonstrates how generative AI can serve as a powerful tool to manage the complexity of a diversified and continually evolving product portfolio. The developed framework offers practical guidelines and strategies for companies to improve their PPM processes by leveraging the latest technological advancements while maintaining ecological and social responsibility. This paper significantly contributes to deepening the understanding of the application of generative AI in PPM and provides a framework for companies to manage their product portfolios more effectively and adapt to changing market conditions. The findings underscore the relevance of continuous adaptation and innovation in PPM strategies and demonstrate the potential of generative AI for proactive and future-oriented business management.

Keywords: servitization, product portfolio management, generative AI, disruptive innovation, machine and plant engineering

Procedia PDF Downloads 82
276 Investigations of Effective Marketing Metric Strategies: The Case of St. George Brewery Factory, Ethiopia

Authors: Mekdes Getu Chekol, Biniam Tedros Kahsay, Rahwa Berihu Haile

Abstract:

The main objective of this study is to investigate marketing strategy practice in the case of the St. George Brewery Factory in Addis Ababa. One of the core requirements for a business to stay in operation is a well-developed marketing strategy. The study assessed how marketing strategies were practiced in the company to achieve its goals, in line with segmentation, target market, positioning, and the marketing mix elements, to satisfy customer requirements. The study was conducted using primary and secondary data and both qualitative and quantitative approaches. The primary data were collected through open- and closed-ended questionnaires. Because the population is small, respondents were selected by census. The findings show that the company used all 4 Ps of the marketing mix in its marketing strategies and provided quality products at affordable prices, promoting its products through strong and effective advertising mechanisms. Product availability and accessibility are admirable, with both direct and indirect distribution channels in use. The company has identified its target customers, and its market segmentation practice is based on geographical location. Communication between the marketing department and other departments is very effective. The adjusted R² indicates that product, price, promotion, and place explain 61.6% of the variance in marketing strategy practice; the remaining 38.4% of the variation in the dependent variable is explained by other factors not included in this study. The results reveal that all four independent variables, product, price, promotion, and place, have positive beta coefficients, indicating that the predictor variables have a positive effect on the dependent variable, marketing strategy practice. Although the company's marketing strategies are effectively practiced, the company faces some problems while implementing them: infrastructure problems, economic problems, intense competition in the market, shortage of raw materials, seasonality of consumption, socio-cultural problems, and the time and cost of creating awareness among customers. Finally, the authors suggest that the company develop a long-range view and implement a more structured approach to obtaining information about potential customers, competitors' actions, and market intelligence within the industry. In addition, we recommend conducting the study with a larger sample size and including additional marketing factors.
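A hedged sketch of the regression form reported above, with marketing strategy practice regressed on the 4 Ps. The data frame is synthetic; only the model specification mirrors the study, which reported an adjusted R² of 0.616.

```python
# Hedged sketch of an OLS regression of marketing strategy practice on the
# 4 Ps. The data are synthetic placeholders, not the study's survey responses.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120                                    # hypothetical respondent count
df = pd.DataFrame({
    "product":   rng.normal(4.0, 0.6, n),  # Likert-style composite scores
    "price":     rng.normal(3.8, 0.7, n),
    "promotion": rng.normal(3.5, 0.8, n),
    "place":     rng.normal(4.1, 0.5, n),
})
df["practice"] = (0.4*df["product"] + 0.3*df["price"] + 0.2*df["promotion"]
                  + 0.3*df["place"] + rng.normal(0, 0.5, n))

model = smf.ols("practice ~ product + price + promotion + place", data=df).fit()
print(model.rsquared_adj)                  # adjusted R-squared
print(model.params)                        # beta coefficients
```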

Keywords: marketing strategy, market segmentation, target marketing, market positioning, marketing mix

Procedia PDF Downloads 60