Search results for: the creative learning process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21266

16706 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan

Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail

Abstract:

Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers’ decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitude toward restaurant visits. Drawing on source credibility theory, the researcher collected data from 250 respondents by distributing a survey questionnaire through Google Forms, using a non-probability purposive sampling technique, in order to investigate the influence of food bloggers on Generation Z's decision-making process. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. SPSS version 26 was used for statistical testing and data analysis. The findings revealed a moderate positive correlation between the variables, indicating that food bloggers do have an impact on Generation Z's decision-making process.
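
As an illustration of the correlation analysis described above, here is a minimal sketch in Python rather than SPSS; the simulated responses are placeholders, not the study's data:

```python
# Minimal sketch of the correlation step (Python instead of SPSS).
# The simulated responses below are placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 250                                        # respondents, as in the study
credibility = rng.normal(3.5, 0.7, n)          # mean Ohanian-scale scores (1-5)
decision = 0.45 * credibility + rng.normal(2.0, 0.6, n)  # built-in moderate link

r, p = pearsonr(credibility, decision)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")     # r near 0.4-0.5: 'moderate positive'
```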

Keywords: credibility, decision making, food bloggers, Generation Z, e-WOM

Procedia PDF Downloads 78
16705 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of rights holders with the prosperity and long-term development of public learning and of science, culture, and the arts as a whole. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes a specific three-level structure of data rights. The cases analyzed are Google v Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello! Ltd, Campbell v MGN, and Imerman v Tchenguiz. The paper concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property will help establish the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 45
16704 Character and Evolution of Electronic Waste: A Technologically Developing Country's Experience

Authors: Karen C. Olufokunbi, Odetunji A. Odejobi

Abstract:

This paper examines the generation, accumulation, and growth of e-waste in a developing country. Images and other data about computer e-waste were collected using a digital camera, 290 copies of a questionnaire, and three structured interviews, using the Obafemi Awolowo University (OAU), Ile-Ife, Nigeria, environment as a case study. The numerical data were analysed using the R data analysis and processing tool. Automata-based techniques and a Petri net modelling tool were used to design and simulate a computational model for the recovery of saleable materials from e-waste. The R analysis showed that, at a 95 percent confidence level, the computer equipment that will be disposed of by 2020 will be 417 units. Compared to the 800 units in circulation in 2014, 50 percent of personal computer components will become e-waste. This indicates that personal computer components were in high demand due to their low cost and will be disposed of more rapidly when replaced by new computer equipment. Also, 57 percent of the respondents discarded their computer e-waste by throwing it into the garbage bin or by dumping it. The model simulated with the Coloured Petri net modelling tool showed that the e-waste dynamics is a forward sequential process in the form of a pipeline, meaning that the recovery of saleable materials from e-waste occurs in identifiable discrete stages, indicating that e-waste will continue to accumulate and grow in volume over time.

Keywords: Coloured Petri net, computational modelling, electronic waste, electronic waste process dynamics

Procedia PDF Downloads 169
16703 Design of Process Parameters in Electromagnetic Forming Apparatus by FEM

Authors: Hyeong-Gyu Park, Hak-Gon Noh, Beom-Soo Kang, Jeong Kim

Abstract:

The electromagnetic forming (EMF) process is a high-speed forming process that uses an electromagnetic body (Lorentz) force to deform the workpiece. The advantages of EMF include improved formability, reduced wrinkling, and non-contact forming. In this study, a spiral coil is considered in order to evaluate formability in terms of the pressure distribution of the forming process. Forming results from numerical analysis using the ANSYS code are also presented. In the numerical simulation, an RLC circuit coupled with the spiral coil was modelled to account for design parameters such as the system input current and the electromagnetic force. The simulation results show that even though the peak input current is the same in each case, the forming conditions differ because of the frequency of the input current and the magnitudes of the current density and magnetic flux density. Finally, the results show that the electromagnetic forming force is strongly affected by the input current frequency, which determines the magnitude of the current density and magnetic flux density.
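
The discharge current that drives the coil in such an apparatus follows the standard series-RLC solution. The sketch below plots the underdamped current waveform whose frequency and peak, as the abstract notes, govern the induced current density; the component values are illustrative assumptions, not the paper's:

```python
# Underdamped series-RLC discharge: i(t) = (V0 / (w_d * L)) * exp(-alpha*t) * sin(w_d*t)
# Component values are illustrative, not taken from the paper.
import numpy as np

V0 = 5000.0      # capacitor charging voltage [V]
C = 100e-6       # capacitance [F]
L = 2e-6         # coil + circuit inductance [H]
R = 10e-3        # circuit resistance [ohm]

alpha = R / (2 * L)                       # damping coefficient [1/s]
w0 = 1 / np.sqrt(L * C)                   # undamped natural frequency [rad/s]
w_d = np.sqrt(w0**2 - alpha**2)           # damped ringing frequency [rad/s]

t = np.linspace(0, 200e-6, 1000)          # 0-200 microseconds
i = (V0 / (w_d * L)) * np.exp(-alpha * t) * np.sin(w_d * t)

print(f"discharge frequency ~ {w_d / (2 * np.pi) / 1e3:.1f} kHz, "
      f"peak current ~ {i.max() / 1e3:.1f} kA")
```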

Keywords: electromagnetic forming, high-speed forming, RLC circuit, Lorentz force

Procedia PDF Downloads 459
16702 Learning, Teaching and Assessing Students’ ESP Skills via eXe and Hot Potatoes Software Programs

Authors: Naira Poghosyan

Abstract:

In the knowledge society, the content of studies, the methods used, and the requirements for an educator’s professionalism regularly undergo certain changes. It follows that in the knowledge society the aim of education is not only to educate professionals for a certain field but also to help students become aware of cultural values, form mutual human relationships, collaborate, be open, adapt to new situations, creatively express their ideas, and accept responsibility and challenge. From this viewpoint, the development of communicative language competence requires a thorough, coordinated approach to ensure proper comprehension and memorization of subject-specific words, starting from the high school level. On the other hand, ESP (English for Specific Purposes) teachers and practitioners are increasingly faced with the task of developing and exploiting new ways of assessing their learners’ literacy while learning and teaching ESP. The presentation will highlight the latest achievements in this field. The author will present some practical methodological issues and principles associated with learning, teaching and assessing the ESP skills of learners, using the two software programs eXe 2.0 and Hot Potatoes 6. On the one hand, the author will display the advantages of the two programs as self-learning and self-assessment interactive tools in the course of academic study and professional development of CLIL learners; on the other hand, she will comprehensively shed light upon some methodological aspects of working out appropriate ways of selecting, introducing and consolidating subject-specific materials via eXe 2.0 and Hot Potatoes 6. The author will then distinguish ESP courses by the general nature of the learners’ specialty, identifying three large categories: EST (English for Science and Technology), EBE (English for Business and Economics) and ESS (English for the Social Sciences). The cornerstone of the presentation will be the introduction of the subject titled “The Methodology of Teaching ESP in Non-Linguistic Institutions”, where a unique case of teaching ESP in Architecture and Construction via eXe 2.0 and Hot Potatoes 6 will be introduced, exemplifying how introduction, consolidation and assessment can be used as a basis for feedback to ESP learners in a particular professional field.

Keywords: ESP competences, ESP skill assessment/self-assessment tool, eXe 2.0/Hot Potatoes software program, ESP teaching strategies and techniques

Procedia PDF Downloads 379
16701 Influence of Decolourisation Conditions on the Physicochemical Properties of Shea (Vitellaria paradoxa Gaertner F) Butter

Authors: Ahmed Mohammed Mohagir, Ahmat-Charfadine Mahamat, Nde Divine Bup, Richard Kamga, César Kapseu

Abstract:

In this investigation, kinetic studies of the adsorption of colour material from shea butter showed a peak at a wavelength of 440 nm, and the equilibrium time was found to be 30 min. Response surface methodology (RSM) applying a Doehlert experimental design was used to investigate the decolourisation parameters of crude shea butter. The decolourisation process was significantly influenced by three independent parameters: contact time, decolourisation temperature and adsorbent dose. The responses of the process were oil loss, acid value, peroxide value and colour index. Response surface plots were successfully made to visualise the effect of the independent parameters on the responses of the process.
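
As a sketch of the response-surface step, the following fits a second-order polynomial model to a small set of hypothetical (time, temperature, dose) design points; a full Doehlert design for three factors would use 13 runs, and none of these numbers come from the paper:

```python
# Fit a second-order response-surface model y = f(time, temperature, dose).
# Design points and responses are hypothetical placeholders, not the paper's.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# [contact time (min), temperature (C), adsorbent dose (%)] -> colour index
X = np.array([[30, 60, 1.0], [45, 60, 1.5], [30, 80, 2.0], [60, 70, 1.0],
              [45, 80, 1.5], [60, 60, 2.0], [45, 70, 1.5], [30, 70, 1.0],
              [60, 80, 2.0]])
y = np.array([4.1, 3.2, 2.5, 3.8, 2.1, 2.9, 2.6, 3.9, 2.2])

poly = PolynomialFeatures(degree=2)            # linear + interaction + quadratic
model = LinearRegression().fit(poly.fit_transform(X), y)

# Predict the response at a candidate operating point; the fitted coefficients
# can be plotted as response surfaces or minimised to locate optimal conditions.
print(model.predict(poly.transform([[50, 75, 1.8]])).round(2))
```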

Keywords: decolourisation, Doehlert experimental design, physicochemical characterisation, RSM, shea butter

Procedia PDF Downloads 421
16700 Peace through Environmental Stewardship

Authors: Elizabeth D. Ramos

Abstract:

Peace education supports a holistic appreciation for the value of life and the interdependence of all living systems. Peace education aims to build a culture of peace. One way of building a culture of peace is through environmental stewardship. This study sought to find out the environmental stewardship practices in selected Higher Education Institutions (HEIs) in the Philippines and how these practices lead to building a culture of peace. The findings revealed that there is still room for improvement in implementing environmental stewardship in schools through academic service learning. In addition, the following manifestations are implemented very satisfactorily in schools: 1) waste reduction, reuse, and recycling, 2) community service, 3) clean and green surroundings. Administrators of the schools in the study lead their staff and students in implementing environmental stewardship. It could be concluded that those involved in environmental stewardship display an acceptable culture of peace, particularly solidarity, respect for persons, and inner peace.

Keywords: academic service learning, environmental stewardship, leadership support, peace, solidarity

Procedia PDF Downloads 510
16699 Machine Learning Techniques in Seismic Risk Assessment of Structures

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used for forecasting ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site condition for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as artificial neural networks, random forests, and support vector machines. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, random forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between the structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the exceedance probability of damage for pre-defined limit states, and therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analyses.
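
A minimal sketch of the first comparison, a linear model versus a random forest for ground-motion prediction, on synthetic data; the data-generating relation and features are invented for illustration and are not the study's:

```python
# Compare a linear ground-motion model with a random forest on synthetic data.
# The data-generating relation below is made up for illustration; it is not
# the study's dataset or its fitted ground-motion model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
M = rng.uniform(4.0, 7.5, n)                   # magnitude
R = rng.uniform(5.0, 200.0, n)                 # source-to-site distance [km]
ln_pga = 1.2 * M - 1.5 * np.log(R + 10) - 4.0 + rng.normal(0, 0.4, n)

X = np.column_stack([M, R])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("random forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    # RF captures the log-distance nonlinearity without a pre-defined equation
    print(name, "R^2 =", round(model.score(X_te, y_te), 3))
```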

Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine

Procedia PDF Downloads 110
16698 Performing Fat Activism in Australia: An Autoethnographic Exploration

Authors: Jenny Lee

Abstract:

Fat Studies is emerging as an interdisciplinary area of study, intersecting with Gender Studies, Sociology, Human Development and the Creative Arts. A focus on weight loss, and therefore fat hatred, has resulted in a form of discriminatory institutional practice that impacts women in the Western world. This focus is sanctioned by a large dieting industry, medical associations, the media and, at times, government initiatives. This paper will discuss the emergence of the so-called ‘Obesity Epidemic’ in Australia and the Western world and the stereotypes that thin equals healthy and fat equals unhealthy. It will argue that, for those with a health focus, ‘Health at Every Size’ is a more effective principle, which involves striving for healthy living without a focus on weight loss. This discussion will contextualise an autoethnographic exploration of how fat acceptance and Health at Every Size can be encouraged through fat activism and fat political art. As part of this paper, a selection of recent performance, writing and art in Australia will be presented, including Aquaporko, the fat femme synchronised swim team, and VaVaBoomBah, the Melbourne fat burlesque performances.

Keywords: activism, fat, health, obesity, performance

Procedia PDF Downloads 185
16697 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments

Authors: Skyler Kim

Abstract:

An early diagnosis of leukemia has always been a challenge to doctors and hematologists. On a worldwide basis, it was reported that there were approximately 350,000 new cases in 2012, and diagnosing leukemia was time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnosis tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these methods was the AI approach. This approach has become a major trend in recent years, and several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia. The results from this development work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 images in total, 8491 of which are images of abnormal cells and 5398 of which are normal. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system detects and classifies leukemia. Different from other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, the features fused from specific abstraction layers can be treated as auxiliary features and lead to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimension feature maps to help improve the discriminative capability of intermediate features and also to overcome the problem of vanishing or exploding gradients. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model had a significant advantage in accuracy. The detailed results of each model’s performance, along with their pros and cons, will be presented at the conference.
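
A minimal sketch of this kind of two-backbone feature fusion in Keras; the input size, frozen backbones, and dense head are assumptions for illustration, not the authors' exact configuration:

```python
# Two-backbone feature fusion: extract features with VGG19 and ResNet50,
# concatenate them, and classify. The input size and dense head are
# illustrative choices, not the paper's exact configuration.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG19, ResNet50

inputs = layers.Input(shape=(224, 224, 3))

vgg = VGG19(include_top=False, weights="imagenet", pooling="avg")
res = ResNet50(include_top=False, weights="imagenet", pooling="avg")
vgg.trainable = False   # transfer learning: freeze both backbones
res.trainable = False

# Each backbone applies its own preprocessing to the shared input.
f1 = vgg(tf.keras.applications.vgg19.preprocess_input(inputs))
f2 = res(tf.keras.applications.resnet50.preprocess_input(inputs))

x = layers.Concatenate()([f1, f2])          # fused feature vector (512 + 2048)
x = layers.Dense(256, activation="relu")(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)  # abnormal vs. normal cell

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```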

Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning

Procedia PDF Downloads 191
16696 Facial Emotion Recognition with Convolutional Neural Network Based Architecture

Authors: Koray U. Erbas

Abstract:

Neural networks are appealing for many applications since they are able to learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increases, it becomes possible to represent more complex relationships with automatically extracted features. Nowadays, Deep Neural Networks (DNNs) are widely used in computer vision problems such as classification, object detection, segmentation, image editing, etc. In this work, the facial emotion recognition task is performed by a proposed Convolutional Neural Network (CNN)-based DNN architecture using the FER2013 dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size and network size) are investigated, and ablation study results for the pooling layer, dropout and batch normalization are presented.
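
A small CNN of the kind typically trained on FER2013 (48x48 grayscale inputs, 7 emotion classes) might look as follows; the layer sizes are illustrative, not the paper's tuned architecture:

```python
# Compact CNN for FER2013-style inputs: 48x48 grayscale, 7 emotion classes.
# Layer sizes are illustrative, not the paper's tuned architecture.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.BatchNormalization(),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.BatchNormalization(),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),                      # one of the ablated components
    layers.Dense(7, activation="softmax"),    # anger, disgust, fear, happy, ...
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```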

Keywords: convolutional neural network, deep learning, deep learning based FER, facial emotion recognition

Procedia PDF Downloads 282
16695 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even simulating an entire human body. This paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates their indispensability in meeting the scientific and engineering challenges of the twenty-first century, and shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC to evaluate or solve more limited but meaningful problem instances. The paper also discusses solutions to optimization problems and the benefits of Big Data for computational biology, and surveys the current state of the art and future generations of HPC computing with Big Data in biology.
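
The "all-to-all comparison" workload mentioned above is easy to sketch: score every pair of sequences in parallel. The similarity function here is a toy placeholder (fractional identity on equal-length strings), not a real aligner:

```python
# Sketch of an all-to-all comparison workload: score every pair of sequences
# in parallel. The similarity function is a toy placeholder (fractional
# identity on equal-length strings), not a real sequence aligner.
from itertools import combinations
from multiprocessing import Pool

def similarity(pair):
    a, b = pair
    return a, b, sum(x == y for x, y in zip(a, b)) / len(a)

sequences = ["ACGTACGT", "ACGTTCGT", "TTGTACGA", "ACCTACGT"]

if __name__ == "__main__":
    with Pool() as pool:  # O(n^2) pairs: the source of the HPC demand
        for a, b, score in pool.map(similarity, combinations(sequences, 2)):
            print(f"{a} vs {b}: {score:.2f}")
```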

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 367
16694 Selecting Graduates for the Interns’ Award by Using Multisource Feedback Process: Does It Work?

Authors: Kathryn Strachan, Sameer Otoom, Amal AL-Gallaf, Ahmed Al Ansari

Abstract:

Introduction: Introducing a reliable method to select graduates for an award in higher education can be challenging but is not impossible. Multisource feedback (MSF) is a popular assessment tool that relies on evaluations by different groups of people, including physicians and non-physicians. It is useful for assessing several domains, including professionalism, communication and collaboration, and may be useful for selecting the best interns to receive a university award. Methods: 16 graduates responded to an invitation to participate in the student award, which was conducted by the Royal College of Surgeons in Ireland-Medical University of Bahrain (RCSI Bahrain) using the MSF process. Five individuals from each of the following categories rated each participant: physicians, nurses, and fellow students. RCSI Bahrain graduates were assessed in the following domains: professionalism, communication, and collaboration. The mean and standard deviation were calculated, and the award was given to the graduate who scored the highest among his/her colleagues. Cronbach's coefficient was used to determine the questionnaire's internal consistency and reliability. Factor analysis was conducted to examine construct validity. Results: 16 graduates participated in the RCSI Bahrain interns' award based on the MSF process, giving a 16.5% response rate. The instrument was found to be suitable for factor analysis and showed a three-factor solution representing 79.3% of the total variance. Reliability analysis using Cronbach's α for internal consistency indicated that the full scale of the instrument had high internal consistency (Cronbach's α = 0.98). Conclusion: This study found the MSF process to be reliable and valid for selecting the best graduates for the interns' award. However, the low response rate may suggest that the process is not feasible for allowing the majority of students to participate in the selection process. Further research studies may be required to support the feasibility of the MSF process in selecting graduates for the university award.
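
For reference, Cronbach's α can be computed directly from an item-score matrix; the sketch below uses simulated ratings, not the study's data:

```python
# Cronbach's alpha for internal consistency, computed from an item-score
# matrix (rows = completed rating forms, columns = questionnaire items).
# The data here is a simulated stand-in, not the study's ratings.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1).sum() # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(80, 1))                # shared trait -> high alpha
scores = latent + rng.normal(scale=0.4, size=(80, 10))
print(f"alpha = {cronbach_alpha(scores):.2f}")   # close to the study's 0.98
```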

Keywords: MSF, RCSI, validity, Bahrain

Procedia PDF Downloads 345
16693 The Effects of Normal Aging on Reasoning Ability: A Dual-Process Approach

Authors: Jamie A. Prowse Turner, Jamie I. D. Campbell, Valerie A. Thompson

Abstract:

The objective of the current research was to use a dual-process theory framework to explain age-related differences in reasoning. Seventy-two older (M = 80.0 years) and 72 younger (M = 24.6 years) adults were given a variety of reasoning tests (i.e., a syllogistic task, a base rate task, the Cognitive Reflection Test, and a perspective manipulation), as well as independent tests of capacity (working memory, processing speed, and inhibition), thinking styles, and metacognitive ability, to account for these age-related differences. It was revealed that age-related differences were limited to problems that required Type 2 processing and were related to differences in cognitive capacity, individual difference factors, and strategy choice. Furthermore, older adults’ performance can be improved by reasoning from another’s perspective and cannot, at this time, be explained by metacognitive differences between young and older adults. All of these findings fit well within a dual-process theory of reasoning, which provides an integrative framework accounting for previous findings and the findings presented in the current manuscript.

Keywords: aging, dual-process theory, performance, reasoning ability

Procedia PDF Downloads 195
16692 The Fluid Limit of the Critical Processor Sharing Tandem Queue

Authors: Amal Ezzidani, Abdelghani Ben Tahar, Mohamed Hanini

Abstract:

A finite sequence of tandem queues is considered in this study. Each queue has a single server, which operates under the egalitarian processor sharing discipline. External customers arrive at each queue according to a renewal input process, with a general service-time distribution. Upon completing service, customers leave the current queue and enter the next. Under mild assumptions, including critically loaded data, we prove the existence and uniqueness of the fluid solution. For the asymptotic behavior, we provide necessary and sufficient conditions for the invariant state and for convergence to this invariant state. Finally, we establish the convergence of a correctly normalized state process to a fluid limit characterized by a system of algebraic and integral equations.
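
A toy time-stepped simulation of two egalitarian processor-sharing queues in tandem illustrates the model; exponential interarrivals and job sizes are used here for simplicity, whereas the paper allows renewal arrivals and general service times:

```python
# Toy time-stepped simulation of two egalitarian processor-sharing queues in
# tandem. Poisson arrivals and exponential job sizes are simplifications; the
# paper's model allows renewal arrivals and general service-time distributions.
import random

random.seed(0)
dt, horizon, lam = 0.001, 500.0, 0.95     # near-critical load (rho ~ 0.95)
q1, q2 = [], []                           # remaining work of jobs in each queue
t, next_arrival = 0.0, random.expovariate(lam)

while t < horizon:
    while t >= next_arrival:              # external arrivals join queue 1
        q1.append(random.expovariate(1.0))
        next_arrival += random.expovariate(lam)
    for q, downstream in ((q1, q2), (q2, None)):
        if q:
            share = dt / len(q)           # egalitarian PS: equal capacity shares
            q[:] = [w - share for w in q]
            finished = sum(1 for w in q if w <= 0)
            q[:] = [w for w in q if w > 0]
            if downstream is not None:    # completed jobs flow on to queue 2
                downstream.extend(random.expovariate(1.0)
                                  for _ in range(finished))
    t += dt

print("jobs in queue 1:", len(q1), "| jobs in queue 2:", len(q2))
```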

Keywords: fluid limit, fluid model, measure valued process, processor sharing, tandem queue

Procedia PDF Downloads 330
16691 Material and Parameter Analysis of the PolyJet Process for Mold Making Using Design of Experiments

Authors: A. Kampker, K. Kreisköther, C. Reinders

Abstract:

Since additive manufacturing technologies constantly advance, the use of this technology in mold making seems reasonable. Many manufacturers of additive manufacturing machines, however, do not offer any suggestions on how to parameterize the machine to achieve optimal results for mold making. The purpose of this research is to determine the interdependencies of different materials and parameters within the PolyJet process by using design of experiments (DoE), in order to additively manufacture molds, e.g., for thermoforming and injection molding applications. Therefore, the general requirements of thermoforming molds, such as heat resistance, surface quality and hardness, have been identified. Then, different materials and parameters of the PolyJet process, such as the orientation of the printed part, the layer thickness, the printing mode (matte or glossy), the distance between printed parts and the scaling of parts, have been examined. The multifactorial analysis covers the following properties of the printed samples: tensile strength, tensile modulus, bending strength, elongation at break, surface quality, heat deflection temperature and surface hardness. The key objective of this research is that by joining the results from the DoE with the requirements of mold making, optimal and tailored molds can be additively manufactured with the PolyJet process. These additively manufactured molds can then be used in prototyping processes, in process testing and in small to medium batch production.

Keywords: additive manufacturing, design of experiments, mold making, PolyJet, 3D-Printing

Procedia PDF Downloads 259
16690 Investigation of Optimized Mechanical Properties on Friction Stir Welded Al6063 Alloy

Authors: Lingaraju Dumpala, Narasa Raju Gosangi

Abstract:

Friction stir welding (FSW) is a relatively new, environmentally friendly, versatile, and widely used joining technique for soft materials such as aluminum. FSW has received a lot of attention as a solid-state joining method that avoids many common problems of fusion welding and provides an improved, faster way of producing aluminum joints. FSW can be used for various aerospace, defense, automotive and transportation applications. It is necessary to understand friction stir welded joints and their characteristics in order to use this new joining technique in critical applications. This study investigated the mechanical properties of friction stir welded aluminum 6063 alloys. FSW was carried out based on a design of experiments using an L16 mixed-level array, considering tool rotational speed, tool feed rate and tool tilt angle as process parameters. The optimization of the process parameters was carried out by Taguchi-based regression analysis, and the significance of the process parameters was analyzed using ANOVA. It is observed that the considered process parameters highly influence the mechanical properties of Al6063.
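
As a sketch of the Taguchi step, the larger-is-better signal-to-noise ratio used when maximizing a response such as tensile strength can be computed as follows; the replicate values are hypothetical:

```python
# Taguchi "larger-is-better" signal-to-noise ratio, the usual criterion when
# maximising a response such as tensile strength. Replicates are hypothetical.
import numpy as np

def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

# Replicated tensile-strength measurements (MPa) for three of the L16 runs:
runs = {"run 1": [182, 178, 185], "run 2": [196, 191, 198], "run 3": [174, 170, 176]}
for name, y in runs.items():
    print(name, "S/N =", round(sn_larger_is_better(y), 2), "dB")
# The parameter level with the highest mean S/N across its runs is selected,
# and ANOVA on the S/N ratios gives each parameter's percent contribution.
```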

Keywords: FSW, aluminum alloy, mechanical properties, optimization, Taguchi, ANOVA

Procedia PDF Downloads 137
16689 Investigating the Potential of a Blended Format for the Academic Reading Module Course Redesign

Authors: Reham Niazi, Marwa Helmy, Susanne Rizzo

Abstract:

This classroom action research is designed to explore the possibility of adding effective online content to supplement and add learning value to the current reading module. The aim of this research was two-fold: first, to investigate students’ acceptance of and interactivity with online components chosen to orient students to the content and to pave the way for more in-class activities and skill practice. Secondly, the instructor aimed to examine whether students were willing to keep the course contact hours the same with some online components done at home (a flipped approach), or whether they were open to turning the class into a blended format under two scenarios: either keeping the current contact hours and applying the blended format, in which case the face-to-face component would be reduced, or keeping the number of face-to-face classes the same and adding more structured online classes as part of the course hours.

Keywords: blended learning, flipped classroom, graduate students, education

Procedia PDF Downloads 191
16688 Soybean Seed Composition Prediction from Standing Crops Using PlanetScope Satellite Imagery and Machine Learning

Authors: Supria Sarkar, Vasit Sagan, Sourav Bhadra, Meghnath Pokharel, Felix B. Fritschi

Abstract:

Soybean and its derivatives are very important agricultural commodities around the world because of their wide applicability in human food, animal feed, biofuel, and industry. However, the significance of soybean production depends on the quality of the soybean seeds rather than the yield alone. Seed composition is widely dependent on plant physiological properties, aerobic and anaerobic environmental conditions, nutrient content, and plant phenological characteristics, which can be captured by high-temporal-resolution remote sensing datasets. PlanetScope (PS) satellite images have high potential for capturing sequential information on crop growth due to their frequent revisits throughout the world. In this study, we estimate soybean seed composition while the plants are in the field by utilizing PS satellite images and different machine learning algorithms. Several experimental fields were established with varying genotypes, and different seed compositions were measured from the samples as ground truth data. The PS images were processed to extract 462 hand-crafted vegetative and textural features. Four machine learning algorithms, i.e., partial least squares regression (PLSR), random forest regression (RFR), gradient boosting machine (GBM) and support vector machine (SVM), and two recurrent neural network architectures, i.e., long short-term memory (LSTM) and gated recurrent unit (GRU), were used in this study to predict the oil, protein, sucrose, ash, starch, and fiber of soybean seed samples. The GRU and LSTM architectures had two separate branches, one for vegetative features and the other for textural features, which were later concatenated together to predict seed composition. The results show that sucrose, ash, protein, and oil yielded comparable prediction results. The machine learning algorithms that best predicted the six seed composition traits differed. GRU worked well for oil (R-squared: 0.53) and protein (R-squared: 0.36), whereas SVM and PLSR showed the best results for sucrose (R-squared: 0.74) and ash (R-squared: 0.60), respectively. Although RFR and GBM provided comparable performance, these models tended to overfit severely. Among the features, vegetative features were found to be the most important variables compared to textural features. It is suggested to utilize many vegetation indices for machine learning training and to select the best ones by using feature selection methods. Overall, the study reveals the feasibility and efficiency of PS images and machine learning for plot-level seed composition estimation. However, special care should be given when designing the plot size in the experiments to avoid mixed-pixel issues.
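
As a sketch of one of the four baselines, partial least squares regression maps a high-dimensional block of image features to a seed-composition trait; the shapes and synthetic data below are placeholders, not the study's extracted features:

```python
# Partial least squares regression (one of the four baselines): map a
# high-dimensional block of image features to a seed-composition trait.
# The synthetic data stands in for the study's features and ground truth.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 462))        # 462 hand-crafted features per plot
w = rng.normal(size=462)
y = X @ w / 25 + rng.normal(scale=0.5, size=120)   # e.g. protein content

pls = PLSRegression(n_components=10)   # latent components handle collinearity
print("CV R^2:", cross_val_score(pls, X, y, cv=5, scoring="r2").mean().round(2))
```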

Keywords: agriculture, computer vision, data science, geospatial technology

Procedia PDF Downloads 141
16687 Selective Attention as a Search for the Deceased during the Mourning Process

Authors: Sonia Sirtoli Färber

Abstract:

Objective: This study aims to investigate selective attention in the process of mourning, as a normal reaction to loss. Method: In order to develop this research, we used a systematic bibliographic review, following a process of investigation, cataloguing, careful evaluation and synthesis of the documentation, associated with the method of thanatological hermeneutics proposed by Elisabeth Kübler-Ross. Conclusion: After a significant loss, especially the death of a loved one or family member, it is normal for the mourner, motivated by absence, to have a false perception of the presence of the deceased. This phenomenon happens whenever the mourner is in the middle of a crowd, because selective attention causes them to perceive the physical characteristics or tone of voice of the deceased, or to smell the fragrance of the perfume the deceased wore. Details characterizing the dead are perceived by the mourner because the mourner seeks the presence in the absence.

Keywords: Elisabeth Kübler-Ross, mourning, selective attention, thanatology

Procedia PDF Downloads 425
16686 The Problem of Relation between Concepts Empathy and Decentration in Psychology

Authors: Elina Asriyan, Lusine Stepanyan

Abstract:

This article is devoted to the study of the connection between empathy and decentration. We discovered a positive connection between these two indicators. Empathy is a variety of emotional decentration and arises through the process of decentration development. A complex approach was applied to understand the investigated phenomenon. The recorded results indicate that empathy and decentration are interconnected; empathy, being a type of emotional decentration, is conditioned by the formation process of decentration.

Keywords: empathy, decentration, emotional decentration, egocentricity

Procedia PDF Downloads 318
16685 Using the Demonstration Method of Teaching Sewing to Improve the Skills of Form 3 Fashion Designing Students: A Case of Baworo Integrated Community Centre for Employable Skills (BICCES)

Authors: Aboagye Boye Gilbert

Abstract:

Education (teaching and learning), not only in Ghana but in the whole world, is regarded as the stepping stone and vehicle to accelerate a country's economic development and social growth. Vocational and technical education is a basic ingredient for human development and for the country in general, and this has been stressed in Ghana's education system since pre-independence. To this effect, this research seeks to examine the use of the demonstration method of teaching sewing to improve the skills of Form 3 fashion designing students of Baworo Integrated Community Centre for Employable Skills. This research reviewed the literature on what other researchers have done and said on related topics, describes the research design used, and interprets the data gathered in the study. The study was designed to gather information from the school on how teaching methods are used to teach sewing. The targeted respondents contacted to give assistance consisted of students from BICCES, fashion teachers and tailors/garment makers. A sample of 5 teachers, 20 students and 5 tailors was selected to answer the questionnaire items used to gather the data for the study. The study revealed that most teachers and students agreed that demonstration, together with teaching and learning materials, had a positive effect on students' learning of sewing. The study recommends that there should be more mechanisms in place to serve as a guide.

Keywords: VOTEC, BECE, BICCES, SHS

Procedia PDF Downloads 80
16684 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Prediction of Future Data

Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill

Abstract:

Healthcare management systems are of great interest because they offer straightforward and fast management of all aspects relating to a patient, not necessarily medical. Moreover, there are more and more cases of pathologies in which diagnosis and treatment can only be carried out by using medical imaging techniques. With ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets using algorithms and techniques drawn from the fields of statistics, machine learning, and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted by forecasting the independent variables. Forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences regarding a new product; these preference models offer a platform whereby product developers can determine the engineering characteristics that satisfy consumer preferences before developing the product. Recent research shows that these fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model against a linear regression model.
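
The proposed comparison can be sketched as fitting an exponential model y = a * exp(b * t) to a series and comparing its fit against a straight line; the data below is synthetic, not the authors' health-care data:

```python
# Fit y = a * exp(b * t) and compare against a straight-line fit, as the
# abstract proposes. The series below is synthetic, not the authors' data.
import numpy as np
from scipy.optimize import curve_fit

def exp_model(t, a, b):
    return a * np.exp(b * t)

rng = np.random.default_rng(0)
t = np.arange(12, dtype=float)                       # e.g. months of records
y = 50 * np.exp(0.18 * t) * (1 + rng.normal(scale=0.05, size=t.size))

(a, b), _ = curve_fit(exp_model, t, y, p0=(y[0], 0.1))
lin = np.polyfit(t, y, 1)                            # linear baseline

sse_exp = float(np.sum((y - exp_model(t, a, b)) ** 2))
sse_lin = float(np.sum((y - np.polyval(lin, t)) ** 2))
print(f"SSE exponential = {sse_exp:.1f}, SSE linear = {sse_lin:.1f}")
print(f"forecast at t = 12: {exp_model(12.0, a, b):.1f}")
```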

Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function

Procedia PDF Downloads 282
16683 Optimization Analysis of Controlled Cooling Process for H-Shape Steel Beams

Authors: Jiin-Yuh Jang, Yu-Feng Gan

Abstract:

In order to improve the comprehensive mechanical properties of the steel, the cooling rate and the temperature distribution must be controlled in the cooling process. A three-dimensional numerical model for the prediction of the heat transfer coefficient distribution of an H-beam in the controlled cooling process was developed in order to obtain a uniform temperature distribution and minimize the maximum stress and the maximum deformation after controlled cooling. An algorithm developed with a simplified conjugate-gradient method was used as an optimizer to optimize the heat transfer coefficient distribution. The numerical results showed that, for the case of air cooling for 5 seconds followed by water cooling for 6 seconds with a uniform heat transfer coefficient, the cooling rate is 15.5 °C/s, the maximum temperature difference is 85 °C, the maximum stress is 125 MPa, and the maximum deformation is 1.280 mm. After optimizing the heat transfer coefficient distribution in the controlled cooling process with the same cooling time, the cooling rate increases to 20.5 °C/s, the maximum temperature difference decreases to 52 °C, the maximum stress decreases to 82 MPa, and the maximum deformation decreases to 1.167 mm.
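
The optimization step can be sketched with a conjugate-gradient optimizer that tunes zone-wise heat transfer coefficients to flatten a temperature profile. The quadratic surrogate below stands in for the 3-D thermal FEM and is invented for illustration:

```python
# Sketch of the optimization step: tune zone-wise heat-transfer coefficients
# with a conjugate-gradient optimizer to flatten the temperature profile.
# The quadratic surrogate below is invented for illustration; the paper
# couples the optimizer to a 3-D thermal FEM instead.
import numpy as np
from scipy.optimize import minimize

target = 650.0                             # desired uniform temperature [C]

def surrogate_temperatures(h):
    # toy stand-in for the FEM: section temperatures respond to zone h-values
    return 900.0 - 12.0 * h + 0.05 * h**2

def objective(h):
    return np.sum((surrogate_temperatures(h) - target) ** 2)  # non-uniformity

h0 = np.full(4, 10.0)                      # initial h in 4 cooling zones
res = minimize(objective, h0, method="CG") # simplified conjugate-gradient
print("optimized zone h values:", res.x.round(2))
print("resulting temperatures:", surrogate_temperatures(res.x).round(1))
```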

Keywords: controlled cooling, H-Beam, optimization, thermal stress

Procedia PDF Downloads 374
16682 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic estimates.
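
A sketch of the text half of such a pipeline, TF-IDF features from report text feeding a Random Forest; the example reports and labels are invented placeholders, not the Indiana University dataset:

```python
# Sketch of the text half of the pipeline: turn radiology-report text into
# TF-IDF features and classify with a random forest. The example reports and
# labels are invented placeholders, not the Indiana University dataset.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

reports = [
    "Heart size normal. Lungs are clear. No acute disease.",
    "Large left pleural effusion with adjacent consolidation.",
    "No focal consolidation, pneumothorax, or effusion.",
    "Diffuse bilateral opacities concerning for pulmonary edema.",
]
labels = [0, 1, 0, 1]                      # 0 = normal, 1 = abnormal/urgent

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    RandomForestClassifier(random_state=0))
clf.fit(reports, labels)
print(clf.predict(["Right lower lobe consolidation with small effusion."]))
```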

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, Indiana University dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 52
16681 Optimization of Fenton Process for the Treatment of Young Municipal Leachate

Authors: Bouchra Wassate, Younes Karhat, Khadija El Falaki

Abstract:

Leachate is a source of surface water and groundwater contamination if it has not been pretreated. Indeed, its complex structure and pollution load make it extremely difficult for treatment to achieve the required standard limits. The objective of this work is to show the value of advanced oxidation processes for the treatment of leachate from urban waste containing high concentrations of organic pollutants. The efficiency of the Fenton reagent (Fe2+ + H2O2 + H+) for young leachate recovered from household waste collection trucks in the city of Casablanca, Morocco, was evaluated, with the objectives of reducing chemical oxygen demand (COD) and colour. The optimization of certain physicochemical parameters (initial pH value, reaction time, [Fe2+] concentration, and [H2O2]/[Fe2+] ratio) yielded good results in terms of COD reduction and discoloration of the leachate.

Keywords: COD removal, color removal, Fenton process, oxidation process, leachate

Procedia PDF Downloads 289
16680 Relationship between ISO 14001 and Market Performance of Firms in China: An Institutional and Market Learning Perspective

Authors: Hammad Riaz, Abubakr Saeed

Abstract:

An environmental management system (EMS) such as ISO 14001 helps to build corporate reputation and legitimacy, and can also be considered a firm's strategic response to institutional pressure to reduce the impact of business activity on the natural environment. The financial outcomes of certifying with ISO 14001 are still unclear and equivocal. Drawing on institutional and market learning theories, the impact of ISO 14001 on firms' market performance is examined for Chinese firms. By employing a rigorous event study approach, this paper compared ISO 14001-certified firms with non-certified counterpart firms based on different matching criteria, including size, return on assets, and industry. The results indicate that ISO 14001 certification has been priced negatively by investors in both the short and the long run. The paper suggests implications for policymakers, managers, and nonprofit organizations.
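
The event-study logic can be sketched as fitting a market model in an estimation window and cumulating abnormal returns around the certification date; the returns below are simulated, not the Chinese firms' data:

```python
# Event-study sketch: estimate a market model over a pre-event window, then
# cumulate abnormal returns around the certification date. The simulated
# returns below (with a built-in negative reaction) stand in for real data.
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0004, 0.01, 280)                  # daily market returns
firm = 0.0001 + 1.1 * market + rng.normal(0, 0.008, 280)
firm[250:261] -= 0.004                                  # injected event effect

beta, alpha = np.polyfit(market[:250], firm[:250], 1)   # market model fit

event_window = slice(250, 261)                          # days 0..+10
abnormal = firm[event_window] - (alpha + beta * market[event_window])
print(f"CAR over event window: {abnormal.sum():+.3%}")  # negative, as found
```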

Keywords: ISO 14001, legitimacy, institutional forces, event study approach, emerging markets

Procedia PDF Downloads 167
16679 Identifying the Hidden Curriculum Components in the Nursing Education

Authors: Alice Khachian, Shoaleh Bigdeli, Azita Shoghie, Leili Borimnejad

Abstract:

Background and aim: The hidden curriculum is crucial in nursing education and can determine professionalism and professional competence. It has a significant effect on students' moral performance in relation to patients. The present study was conducted with the aim of identifying the hidden curriculum components in a nursing and midwifery faculty. Methodology: The ethnographic study was conducted over two years using the Spradley method in one of the nursing schools located in Tehran. In this focused ethnographic research, the approach of Lincoln and Guba, i.e., transferability, confirmability, and dependability, was used. To increase the validity of the data, they were collected from different sources, such as participatory observation, formal and informal interviews, and document review. Two hundred days of participatory observation, fifty informal interviews, and fifteen formal interviews, drawn from the maximum opportunities and conditions available to obtain multiple and multilateral information, added to the validity of the data. Due to the COVID situation, some interviews were conducted virtually, and the activity of professors and students in the virtual space was also monitored. Findings: The components of the hidden curriculum of the faculty are the atmosphere (physical environment, organizational structure, rules and regulations, hospital environment), the interactions between actors, and teaching-learning activities, which together revealed “a disconnection between goals, speech, behavior, and results”. Conclusion: The atmosphere, the various actors, and the activities mutually affect the process of student development: students have the most contact, first, with their peers, which leads to the most learning, and, secondly, with their teachers. Clinicians who have close, person-to-person contact with students can have very important effects on them. Students who meet capable and satisfied professors become interested in their field and hopeful about their future by following these professors as mentors. On the other hand, weak and dissatisfied professors lead students to feel abandoned; by forming a colony of peers with different backgrounds, a group of students' personalities become distorted and they move away from family values, which necessitates a change in some cultural practices at the faculty level.

Keywords: hidden curriculum, nursing education, ethnography, nursing

Procedia PDF Downloads 112
16678 Monitoring and Evaluation in Community-Based Tourism: An Analysis and Model

Authors: Ivan Gunass Govender, Andrea Giampiccoli

Abstract:

A developmental state should use community engagement to facilitate socio-economic development for disadvantaged groups and individual members of society through empowerment, social justice, sustainability, and self-reliance. In this regard, community-based tourism (CBT), as a growing market, should be an indigenous effort aided by external facilitation. Since this form of tourism presents its own preconditions, characteristics, and challenges, it could be guided by the engagement of higher education institutions. In particular, the facilitation should not only serve to assist community members to reach their own goals, but should also focus on learning through knowledge creation and sharing with the engagement of higher education institutions. While the increased relevance of CBT has produced various CBT manuals (or handbooks/guidelines) aimed at ‘teaching’ and assisting various entities in CBT development, this research aims to analyse the current monitoring and evaluation (M&E) manuals and, thereafter, propose an M&E model for CBT. It is important to mention that, all too often, effective monitoring is not carried out, thus risking the long-term sustainability and improvement of CBT ventures. Therefore, the proposed model also considers some inputs external to the tourism field, in relation to local economic development (LED) matters, from a previously proposed development monitoring and evaluation system framework. M&E should be seen as a fundamental component of any CBT initiative, and the whole CBT intervention should be evaluated. In this context, M&E in CBT should go beyond strictly ‘numerical’ economic matters and should be understood within a holistic view of development. In addition, M&E in CBT should not consider issues in separate ‘compartments’, such as tourists, tourism attractions, CBT owners/participants, and stakeholder engagement, but as interdependent components of a macro-ecosystem. Finally, the external facilitation process should be structured in a way that promotes community self-reliance in both the intervention and the M&E process. The research proposes an M&E model for CBT so as to enhance CBT's possibilities of long-term growth and success through effective collaboration with key stakeholders.

Keywords: community-based tourism, community-engagement, monitoring and evaluation, stakeholders

Procedia PDF Downloads 309
16677 Downscaling Seasonal Sea Surface Temperature Forecasts over the Mediterranean Sea Using Deep Learning

Authors: Redouane Larbi Boufeniza, Jing-Jia Luo

Abstract:

This study assesses the suitability of deep learning (DL) for downscaling sea surface temperature (SST) over the Mediterranean Sea in the context of seasonal forecasting. We design a set of experiments that compare different DL configurations and deploy the best-performing architecture to downscale one-month-lead forecasts of June–September (JJAS) SST from the Nanjing University of Information Science and Technology Climate Forecast System version 1.0 (NUIST-CFS1.0) for the period 1982–2020. We also introduce predictors over a larger area to include information about the main large-scale circulations that drive SST over the Mediterranean Sea region, which improves the downscaling results. Finally, we validate the raw model and downscaled forecasts in terms of both deterministic and probabilistic verification metrics, as well as their ability to reproduce the observed SST extreme and spell indicator indices. The results show that the convolutional neural network (CNN)-based downscaling consistently improves the raw model forecasts, with lower bias and more accurate representations of the observed mean and extreme SST spatial patterns. Besides, the CNN-based downscaling yields much more accurate forecasts of extreme SST and spell indicators and reduces the significant biases exhibited by the raw model predictions. Moreover, our results show that the CNN-based downscaling yields better skill scores than the raw model forecasts over most portions of the Mediterranean Sea. The results demonstrate the potential usefulness of CNNs in downscaling seasonal SST predictions over the Mediterranean Sea, particularly in providing improved forecast products.
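
A minimal CNN downscaling sketch maps a coarse SST field to a finer grid with convolutions and upsampling; the grid sizes and layers are illustrative assumptions, not the deployed architecture:

```python
# Minimal CNN downscaling sketch: map a coarse SST field to a finer grid via
# convolution + upsampling. Grid sizes and layers are illustrative only.
from tensorflow.keras import layers, models

coarse = layers.Input(shape=(24, 48, 1))          # coarse-resolution SST field
x = layers.Conv2D(32, 3, padding="same", activation="relu")(coarse)
x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
x = layers.UpSampling2D(size=4)(x)                # 4x finer grid (96 x 192)
fine = layers.Conv2D(1, 3, padding="same")(x)     # downscaled SST field

model = models.Model(coarse, fine)
model.compile(optimizer="adam", loss="mse")       # trained against observed SST
model.summary()
```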

Keywords: Mediterranean Sea, sea surface temperature, seasonal forecasting, downscaling, deep learning

Procedia PDF Downloads 82