Search results for: quality of life index
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17613

3123 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance

Authors: Abdullah Al Farwan, Ya Zhang

Abstract:

In today’s educational arena, it is critical to understand educational data and be able to evaluate its important aspects, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. Teachers who are able to predict their students' class performance can use this information to improve their teaching. EDM has evolved into valuable knowledge that can be used for a wide range of objectives; for example, it can inform a strategic plan for delivering high-quality education. Based on previous data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest, with wrapper-based feature selection, were applied to two datasets relating to Portuguese language and mathematics classes. The results showed the effectiveness of data mining methodologies in predicting student academic success. The classification accuracies achieved with the selected algorithms lie in the range of roughly 70-94%: the lowest accuracy is achieved by the Multi-layer Perceptron, at about 70.45%, and the highest by Random Forest, at about 94.10%. This proposed work can assist educational administrators in identifying poor-performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent dropout.
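
As a rough illustration of the wrapper-based approach described above, the sketch below applies sequential forward selection (one wrapper strategy) around a Random Forest to a hypothetical student-performance table; the file name, column names, and parameter choices are assumptions, not the paper's setup.

```python
# Minimal sketch (placeholder data and column names, not the paper's datasets):
# wrapper-style feature selection via sequential forward search around a Random
# Forest, then scoring the reduced feature set on held-out data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.metrics import accuracy_score

df = pd.read_csv("student_performance.csv")      # hypothetical file
X, y = df.drop(columns=["final_grade"]), df["final_grade"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

rf = RandomForestClassifier(n_estimators=300, random_state=1)
wrapper = SequentialFeatureSelector(rf, n_features_to_select=8, cv=5)  # wrapper search
wrapper.fit(X_tr, y_tr)

X_tr_sel, X_te_sel = wrapper.transform(X_tr), wrapper.transform(X_te)
rf.fit(X_tr_sel, y_tr)
print("Accuracy with wrapper-selected features:",
      accuracy_score(y_te, rf.predict(X_te_sel)))
```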

Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students’ academic performance

Procedia PDF Downloads 154
3122 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data

Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms could support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore the ability of mean signals extracted from ICA components corresponding to 15 well-known networks to distinguish between controls and patients. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsfMRI), 200 volumes. Estimated total lesion load (ml) and number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR. All rsfMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with the number of components set to 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), using the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM, to obtain a ranking of the most predictive variables. We then built two new classifiers on the most important features only and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable was the sensorimotor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best network for discriminating between controls and early MS was the sensorimotor I. Similar importance values were obtained for the sensorimotor II, cerebellum, and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
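
A minimal sketch of the two classifiers and feature-selection strategies described above is shown below, using random placeholder data of the same shape (37 subjects x 15 network signals). The original analysis was performed in R with an RBF-SVM; here the RFE step uses a linear-kernel SVC because scikit-learn's RFE requires coefficient-based rankings.

```python
# Minimal sketch (random placeholder data, not the study's rs-fMRI features):
# Gini-based ranking with Random Forest and recursive feature elimination with SVM,
# then re-training on the single best feature and evaluating on a held-out set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(37, 15))          # 37 subjects x 15 network mean signals
y = rng.integers(0, 2, size=37)        # 0 = control, 1 = early MS (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Random Forest: Gini-based importances rank the networks
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
top_rf = np.argsort(rf.feature_importances_)[::-1][:1]   # keep the single best network

# SVM: recursive feature elimination down to one feature
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
top_svm = np.where(rfe.support_)[0]

# Retrain on the selected feature only and evaluate on the held-out set
rf_sel = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr[:, top_rf], y_tr)
svm_sel = SVC(kernel="rbf").fit(X_tr[:, top_svm], y_tr)
print("RF accuracy:", accuracy_score(y_te, rf_sel.predict(X_te[:, top_rf])))
print("SVM accuracy:", accuracy_score(y_te, svm_sel.predict(X_te[:, top_svm])))
```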

Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine

Procedia PDF Downloads 228
3121 Leveraging on Application of Customer Relationship Management Strategy as Business Driving Force: A Case Study of Major Industries

Authors: Odunayo S. Faluse, Roger Telfer

Abstract:

Customer relationship management (CRM) is a business strategy centred on the idea that the customer is the driving force of any business, i.e., the customer is placed in a central position in any business. However, this belief, coupled with the advancement of information technology over the past twenty years, has undergone a change. In any form of business today, customers are effectively the modern dictators to whom industry continually adjusts its operations, owing to the increased availability of information, intense market competition, and the ever-growing negotiating power of customers in the process of buying and selling. The most vital role of any organization is to satisfy or meet customers' needs and demands, which eventually determines customers' long-term value to the industry. Therefore, this paper analyses and describes the application of operational CRM strategies in some of the major industries in business. Both established and up-and-coming companies nowadays value the quality of customer service and client loyalty; they also recognize the customers who are not very sensitive to changes in price and thereby realize that attracting new customers is more demanding and expensive than retaining existing customers. Research shows that several factors account for the recent rise in the execution of CRM strategies in the marketplace, such as a shift of attention in some organizations towards retaining existing customers rather than attracting new ones, the gathering of data about customers through internal database systems and the acquisition of external syndicated data, and an exponential increase in technological intelligence. Despite these developments in business operations, CRM research in academia remains nascent; hence, this paper gives a detailed critical analysis of recent advances in the use of CRM and identifies key research opportunities for future development in using CRM implementation as a determinant factor for successful business optimization.

Keywords: agriculture, banking, business strategies, CRM, education, healthcare

Procedia PDF Downloads 211
3120 Determination of Phenolic Compounds in Apples Grown in Different Geographical Regions

Authors: Mindaugas Liaudanskas, Monika Tallat-Kelpsaite, Darius Kviklys, Jonas Viskelis, Pranas Viskelis, Norbertas Uselis, Juozas Lanauskas, Valdimaras Janulis

Abstract:

Apples are an important source of various biologically active compounds beneficial to human health. Phenolic compounds detected in apples are natural antioxidants and have antimicrobial, anti-inflammatory, anticarcinogenic, and cardiovascular protective activity. The quantitative composition of phenolic compounds in apples may be affected by various factors, and it is important to investigate it in order to provide the consumer with high-quality apples of well-characterized composition and with products made from them. The objective of this study was to evaluate the quantitative composition of phenolic compounds in apple fruits grown in different geographical regions. In this study, biological replicates of apple cv. 'Ligol' grown in Lithuania, Latvia, Poland, and Estonia were investigated. Three biological replicates were analyzed, each containing 10 apples. Samples of lyophilized apple fruits were extracted with 70% ethanol (v/v) for 20 min at 40 °C in an ultrasonic bath. The ethanol extracts of apple fruits were analyzed by high-performance liquid chromatography. The study found that the geographical location of the apple trees had an impact on the composition of phenolic compounds in the apples. The amount of quercetin glycosides varied from 314.78±9.47 µg/g (Poland) to 648.17±5.61 µg/g (Estonia). The same trend was also observed for flavan-3-ols (from 829.56±47.17 µg/g to 2300.85±35.49 µg/g), phloridzin (from 55.29±1.7 µg/g to 208.78±0.35 µg/g), and chlorogenic acid (from 501.39±28.84 µg/g to 1704.35±22.65 µg/g). The total amount of the investigated phenolic compounds tended to increase from apples grown in the southern location (Poland) (1701.02±75.38 µg/g) to apples grown in the northern location (Estonia) (4862.15±56.37 µg/g). Apples (cv. 'Ligol') grown in Estonia accumulated approximately 2.86 times more phenolic compounds than apples grown in Poland. Acknowledgment: This work was supported by a grant from the Research Council of Lithuania, project No. S-MIP-17-8.

Keywords: apples, cultivar 'Ligol', geographical regions, HPLC, phenolic compounds

Procedia PDF Downloads 172
3119 Strategies for Tackling Climate Change: Review of Sustainability and Air-Conditioning

Authors: Tosin T. Oye, Keng Goh, Naren Gupta, Toyosi K. Oye

Abstract:

One of the most extreme difficulties confronting humankind in the twenty-first century is the consumption of energy. Non-renewable energy sources have been the fundamental energy assets for human society. The energy consumption arising from the use of air-conditioning has caused, and continues to cause, harm to the environment and human health. The demand for energy could double or perhaps triple in the future because of the use of air-conditioning systems as the worldwide population grows and emerging regions expand their economies. This has recently raised concerns in sustainable development over climate change, global warming, ozone layer depletion, health issues, and possible supply problems. As living standards have improved, air-conditioning has become widely used. Nevertheless, environmental pollution and health issues related to the use of air-conditioning arise ever more frequently. In order to diminish their undesirable impact on the environment, it is essential to establish suitable strategies for tackling climate change. Therefore, this paper aims to review and analyze studies on sustainability and air-conditioning and subsequently suggest strategies for combating climate change. Future perspectives for tackling climate change are likewise suggested. The key findings reveal that sustainability measures are required to reduce energy consumption and carbon emissions in order to tackle climate change and its impact on the environment effectively, and to raise public awareness of the adverse impact of climate change arising from the use of air-conditioning systems. The research outcome offers valuable insight to the general public, organizations, policymakers, and the government in making future municipal zones sustainable and more climate resilient.

Keywords: air-conditioning, climate change, environment, human health, sustainability

Procedia PDF Downloads 114
3118 Using the Weakest Precondition to Achieve Self-Stabilization in Critical Networks

Authors: Antonio Pizzarello, Oris Friesen

Abstract:

Networks, such as the electric power grid, must demonstrate exemplary performance and integrity. Integrity depends on the quality of both the system design model and the deployed software. Integrity of the deployed software is key, for both the original versions and the many versions produced throughout numerous maintenance activities. Current software engineering technology and practice do not produce adequate integrity. Distributed systems utilize networks in which each node is an independent computer system. The connections between nodes are realized via a network that is normally redundantly connected, to guarantee the presence of a path between two nodes in case some branch fails. Furthermore, at each node there is software which may fail. Self-stabilizing protocols are usually present that recognize failure in the network and perform a repair action that will bring the node back to a correct state. These protocols, first introduced by E. W. Dijkstra, are currently present in almost all Ethernets. Superstabilizing protocols, capable of reacting to a change in the network topology due to the removal or addition of a branch, are less common but are theoretically defined and available. This paper describes how to use the Software Integrity Assessment (SIA) methodology to analyze self-stabilizing software. SIA is based on the UNITY formalism for parallel and distributed programming, which allows the analysis of code for verifying the progress property 'p leads-to q', which describes the progress of all computations starting in a state satisfying p to a state satisfying q via the execution of one or more system modules. As opposed to demonstrably inadequate test and evaluation methods, SIA allows the analysis and verification of any network self-stabilizing software, as well as any other software that is designed to recover from failure without external intervention by maintenance personnel. The model to be analyzed is obtained by automatic translation of the system code into a transition system that is based on the use of the weakest precondition.
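
For readers less familiar with the formalism, the following is a minimal sketch of the standard definitions assumed here, Dijkstra's weakest precondition for assignment and the UNITY progress operators; it is a generic textbook formulation, not the paper's own model.

```latex
% Generic definitions (assumed, not the paper's own formalization).
% Weakest precondition of an assignment: substitute e for x in the postcondition Q.
\[
  wp(x := e,\; Q) \;=\; Q[x := e]
\]
% UNITY progress: p ensures q holds when p persists unless q does and some
% program statement s establishes q from p \wedge \neg q.
\[
  p \;\textit{ensures}\; q \;\equiv\; (p \;\textit{unless}\; q) \;\wedge\; \exists s : \{p \wedge \neg q\}\; s\; \{q\}
\]
% p leads-to q is the smallest relation containing ensures and closed under
% transitivity and disjunction:
\[
  (p \;\textit{ensures}\; q) \Rightarrow (p \mapsto q), \qquad
  (p \mapsto r) \wedge (r \mapsto q) \Rightarrow (p \mapsto q)
\]
```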

Keywords: network, power grid, self-stabilization, software integrity assessment, UNITY, weakest precondition

Procedia PDF Downloads 214
3117 Trends in Use of Millings in Pavement Maintenance

Authors: Rafiqul Tarefder, Mohiuddin Ahmad, Mohammad Hossain

Abstract:

While millings from old pavement surfaces can be an important component of cost-effective maintenance operations, their use in maintenance projects is neither uniform nor well documented. This study documents the different maintenance practices followed by four transportation districts of the New Mexico Department of Transportation (NMDOT) in an attempt to find out whether millings are being used in maintenance projects by those districts. Based on the existing literature, a questionnaire was developed covering six common maintenance practices. NMDOT district personnel were interviewed face to face to discuss and answer that questionnaire. The interviews revealed that NMDOT districts mainly use chip seal and patching; other maintenance procedures, such as sand seal, scrub seal, slurry seal, and thin overlay, have limited use. Two out of the four participating districts do not have any documents on chip sealing; rather, they rely on the experience of the chip seal crew. All districts use polymer-modified high-float emulsion (HFE100P) for chip seal, with an application rate ranging from 0.4 to 0.56 gallons per square yard. The chip application rate varies from 15 to 40 lb per square yard. Statewide, the thickness of chip seal varies from 3/8" to 1", and its life varies from 3 to 10 years. NMDOT districts mainly use three types of patching: pothole, dig-out, and blade patches. Pothole patches are used for small potholes and during emergencies; dig-out patches are used for all types of potholes, sometimes after pothole patching; and blade patches are used when a significant portion of the pavement is damaged. Pothole patches may last as little as three days, whereas a blade patch can last as long as three years. It was observed that all participating districts use millings in maintenance projects.

Keywords: chip seal, sand seal, scrub seal, slurry seal, overlay, patching, millings

Procedia PDF Downloads 332
3116 The Impact of Board Characteristics on Firm Performance: Evidence from Banking Industry in India

Authors: Manmeet Kaur, Madhu Vij

Abstract:

The Board of Directors in a firm performs the primary role of an internal control mechanism. This study seeks to understand the relationship between internal governance and the performance of banks in India. The paper investigates the effect of board structure (proportion of non-executive directors, gender diversity, board size, and meetings per year) on firm performance. It evaluates the impact of corporate governance mechanisms on banks' financial performance using panel data for 28 banks listed on the National Stock Exchange of India for the period 2008-2014. Return on assets, return on equity, Tobin's Q, and net interest margin were used as financial performance indicators. To estimate the relationship between governance and bank performance, the study initially uses pooled ordinary least squares (OLS) estimation and generalized least squares (GLS) estimation. A panel generalized method of moments (GMM) estimator is then developed to investigate the dynamic nature of the performance-governance relationship. The study empirically confirms that the two-step system GMM approach controls for the problems of unobserved heterogeneity and endogeneity, unlike the OLS and GLS approaches. The results suggest that banks with small boards, boards with female members, and boards that meet more frequently tend to be more efficient and consequently have a positive impact on bank performance. The study offers insights to policymakers interested in enhancing the quality of governance of banks in India. The findings also suggest that board structure plays a vital role in improving the corporate governance mechanisms of financial institutions. There is a need for efficient boards in banks to improve the overall health of financial institutions and the economic development of the country.
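
A minimal sketch of the kind of pooled OLS baseline estimated before the GMM step is shown below; the file and column names are hypothetical, and the dynamic two-step system GMM itself, which requires specialized panel estimators, is not reproduced here.

```python
# Minimal sketch (not the paper's code): a pooled OLS regression of a performance
# measure on board-structure variables, with standard errors clustered by bank.
import pandas as pd
import statsmodels.api as sm

panel = pd.read_csv("bank_panel.csv")   # hypothetical file: one row per bank-year

y = panel["roa"]                                    # e.g. return on assets
X = panel[["board_size", "pct_female_directors",
           "meetings_per_year", "pct_nonexecutive"]]
X = sm.add_constant(X)

pooled_ols = sm.OLS(y, X).fit(cov_type="cluster",
                              cov_kwds={"groups": panel["bank_id"]})
print(pooled_ols.summary())

# The two-step system GMM step (lagged dependent variable instrumented with its
# own lags) is typically run with specialized panel estimators and is not shown here.
```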

Keywords: board of directors, corporate governance, GMM estimation, Indian banking

Procedia PDF Downloads 246
3115 Outcomes of Using Guidelines for Caring and Referring ST Elevation Myocardial Infarction (STEMI) Patients at the Accident and Emergency Department of Songkhla Hospital, Thailand

Authors: Thanom Kaeniam

Abstract:

ST-elevation myocardial infarction (STEMI) is a condition in which heart muscle dies suddenly due to sudden blockage of a coronary artery. STEMI patients are usually in critical condition and at risk of sudden death. Therefore, management guidelines are needed for safety in caring for and referring STEMI patients. The objective of this developmental research was to assess the effectiveness of using the guidelines for caring for and referring STEMI patients at the Accident and Emergency Department of Songkhla Hospital. The subjects of the study were 22 nurses in the emergency room and the doctors on duty in the accident and emergency room, selected using purposive sampling with inclusion criteria. The research instruments were the guidelines for caring for and referring STEMI patients and record forms on the effectiveness of using the guidelines (a general record form for STEMI patients, a record form for SK administration, a referral record form for PCI, and a record form for patients who died in the accident and emergency room or during referral). The instruments were tested for content validity by three experts, and reliability was tested using Kuder-Richardson 20 (KR20). The descriptive statistic employed was the percentage. The outcomes revealed that before the guidelines were used, in 2009, 2010, and 2011, there were 84, 73, and 138 STEMI patients receiving services at the accident and emergency room, of whom only 9, 32, and 48 patients were referred for PCI/SK medications (10.74, 43.84, and 34.78 percent), and the death rates were 10.71, 10.95, and 11.59 percent, respectively. After the guidelines were used, in 2012, 2013, and 2014, there were 97, 77, and 57 patients, of whom referrals for PCI/SK medications increased to 77, 72, and 55 patients (79.37, 93.51, and 96.49 percent), and the death rates were reduced to 10.30, 6.49, and 1.76 percent, respectively. The results of the study revealed that the use of the guidelines for caring for and referring STEMI patients at the Accident and Emergency Department increased the effectiveness and quality of nursing, especially in terms of SK medication and the care and referral of patients for PCI, thereby reducing the death rate.

Keywords: outcomes, guidelines for caring, referring, myocardial infarction, STEMI

Procedia PDF Downloads 382
3114 Thrombocytopenia and Prolonged Prothrombin Time in Neonatal Septicemia

Authors: Shittu Bashirat, Shittu Mujeeb, Oluremi Adeolu, Orisadare Olayiwola, Jikeme Osameke, Bello Lateef

Abstract:

Septicemia in neonates refers to a generalized bacterial infection documented by a positive blood culture in the first 28 days of life and is one of the leading causes of neonatal mortality in sub-Saharan Africa. Thrombocytopenia in newborns is a result of increased platelet consumption, and sepsis was found to be its most common risk factor. The objectives of the study were to determine whether there are organism-specific platelet responses between the two groups of bacterial agents, Gram-positive and Gram-negative bacteria, and to examine the association of platelet count and prothrombin time with neonatal septicemia. 232 blood samples were collected for this study. Blood culture was performed using the Bactec 9050, an instrumented blood culture system. The platelet count and prothrombin time were determined using an Abacus Junior 5 hematology analyzer and an i-STAT 1 analyzer, respectively. Of the 231 neonates hospitalized with clinical sepsis, blood culture reports were positive in 51 cases (21.4%). Klebsiella spp. (35.3%) and Staphylococcus aureus (27.5%) were the most common Gram-negative and Gram-positive isolates, respectively. Thrombocytopenia was observed in 30 (58.8%) of the neonates with septicemia. Of the 9 (17.6%) patients with severe thrombocytopenia, seven (77.8%) had Klebsiella spp. septicemia. Of the 21 (63.6%) cases of thrombocytopenia produced by Gram-negative isolates, 17 (80.9%) had increased prothrombin time. In conclusion, Gram-negative organisms showed the highest number of cases of severe thrombocytopenia and prolonged PT. This study has helped to establish that there is a disturbance of the hemostatic system in neonates with septicemia. Further studies, however, may be required to assess other hemostasis parameters in order to understand their interaction with the infecting organisms in neonates.

Keywords: neonates, septicemia, thrombocytopenia, prolonged prothrombin time, platelet count

Procedia PDF Downloads 392
3113 Collaboration During Planning and Reviewing in Writing: Effects on L2 Writing

Authors: Amal Sellami, Ahlem Ammar

Abstract:

Writing is acknowledged to be a cognitively demanding and complex task. Indeed, the writing process is composed of three iterative sub-processes, namely planning, translating (writing), and reviewing. Not only do second or foreign language learners need to write according to this process, but they also need to respect the norms and rules of language and writing in the text to be produced. Accordingly, researchers have suggested approaching writing as a collaborative task in order to alleviate its complexity. Consequently, collaboration has been implemented during the whole writing process or only during planning or reviewing. Researchers report that implementing collaboration during the whole process might be demanding in terms of time in comparison to individual writing tasks; consequently, because of time constraints, teachers may avoid it. For this reason, it might be pedagogically more realistic to limit collaboration to one of the writing sub-processes (i.e., planning or reviewing). However, previous research implementing collaboration in planning or reviewing is limited and fails to explore the effects of these two conditions on the written text. Consequently, the present study examines the effects of collaboration in planning and collaboration in reviewing on the written text. To reach this objective, quantitative as well as qualitative methods are deployed to examine the written texts holistically and in terms of fluency, complexity, and accuracy. Participants of the study included 4 pairs in each group (n=8). They participated in two experimental conditions: (1) collaborative planning followed by individual writing and individual reviewing, and (2) individual planning followed by individual writing and collaborative reviewing. The comparative findings indicate that while collaborative planning resulted in better overall text quality (more precisely, better content and organization ratings), better fluency, better complexity, and fewer lexical errors, collaborative reviewing produced better accuracy and fewer syntactic and mechanical errors. The discussion of the findings suggests the need for more comparative research in order to further explore the effects of collaboration in planning or in reviewing. Pedagogical implications of the current study include advising teachers to choose between implementing collaboration in planning or in reviewing depending on their students' needs and what they need to improve.

Keywords: collaboration, writing, collaborative planning, collaborative reviewing

Procedia PDF Downloads 86
3112 Experimental Architectural Pedagogy: Discipline Space and Its Role in the Modern Teaching Identity

Authors: Matthew Armitt

Abstract:

The revolutionary school of architectural teaching, VKhUTEMAS (1923-1926), embodied a new approach for a new society, bringing architectural education to the masses and the masses to growing industrial production. The school's pedagogical contribution of the 1920s made it an important school of the modernist movement, engaging pedagogy as a mode of experimentation. Its teachers and students saw design education not just as a process of knowledge transfer but as a vehicle for design innovation, developing an approach without precedent. This process of teaching and learning served as a vehicle for venturing into the unknown through a discipline of architectural teaching called "Space", developed by the Soviet architect Nikolai Ladovskii (1881-1941). The creation of "Space" was paramount not only for its innovative pedagogy but also as an experimental laboratory for developing a new architectural language. This paper discusses whether the historical teaching of "Space" can function in the construction of the modern teaching identity today to promote the value, richness, quality, and diversity inherent in architectural design education. The history of "Space" teaching remains unknown within academic circles and separate from the current architectural teaching debate. Using VKhUTEMAS and the teaching of "Space" as a pedagogical lens, and drawing upon research carried out in the Russian Federation, America, Canada, Germany, and the UK, this paper discusses how historically different models of teaching and learning can intersect, examining historically based educational research and exploring different design studio initiatives, pedagogical methodologies, teaching and learning theories, and problem-based projects. There are strong arguments and a strong desire for pedagogical change, and this paper promotes new historical and educational research to widen the current academic debate by exposing new approaches to architectural teaching today.

Keywords: VKhUTEMAS, discipline space, modernist pedagogy, teaching identity

Procedia PDF Downloads 115
3111 Reliability Analysis of Variable Stiffness Composite Laminate Structures

Authors: A. Sohouli, A. Suleman

Abstract:

This study focuses on the reliability analysis of variable stiffness composite laminate structures to investigate their potential structural improvement compared to conventional (straight-fiber) composite laminate structures. A computational framework was developed which consists of a deterministic design step and a reliability analysis. The optimization part uses Discrete Material Optimization (DMO), and the reliability of the structure is computed by Monte Carlo Simulation (MCS) after applying the Stochastic Response Surface Method (SRSM). The design driver in the deterministic optimization is maximum stiffness, while the optimization method incorporates certain manufacturing constraints to attain industrial relevance. These manufacturing constraints are that the change of orientation between adjacent patches cannot be too large and that the maximum number of successive plies of a particular fiber orientation should not be too high. Variable stiffness composites may be manufactured by Automated Fiber Placement (AFP) machines, which provide consistent quality at good production rates. However, laps and gaps are the most important challenges in steering fibers and affect the performance of the structures. In this study, the optimal curved fiber paths at each layer of the composites are first designed by DMO, and then the reliability analysis is applied to investigate the sensitivity of the structure, for different standard deviations, compared to straight-fiber-angle composites. The random variables are the material properties and the loads on the structures. The results show that the variable stiffness composite laminate structures are much more reliable, even for high standard deviations of the material properties, than the conventional composite laminate structures. The reason is that the variable stiffness composite laminates allow stiffness tailoring and provide the possibility of adjusting the stress and strain distributions favorably in the structures.
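
The sketch below illustrates the Monte Carlo step in a hedged, simplified form: a cantilever tip-deflection limit state stands in for the laminate response surface, and all distributions and numbers are hypothetical rather than taken from the study.

```python
# Minimal sketch (hypothetical numbers and limit state, not the authors' framework):
# Monte Carlo estimation of a failure probability with random material stiffness
# and load.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

E = rng.normal(70e9, 0.10 * 70e9, n)     # Young's modulus [Pa], CoV 10% (hypothetical)
P = rng.normal(500.0, 0.20 * 500.0, n)   # tip load [N], CoV 20% (hypothetical)
L, I = 1.0, 8.0e-8                       # length [m] and second moment of area [m^4]

w = P * L**3 / (3.0 * E * I)             # tip deflection of a cantilever beam
w_allow = 0.040                          # allowable deflection [m] (hypothetical)

g = w_allow - w                          # limit-state function: failure when g < 0
pf = np.mean(g < 0.0)                    # Monte Carlo estimate of failure probability
print(f"Estimated probability of failure: {pf:.4%}")
```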

Keywords: material optimization, Monte Carlo simulation, reliability analysis, response surface method, variable stiffness composite structures

Procedia PDF Downloads 505
3110 The Integration Challenges of Women Refugees in Sweden from Socio-Cultural Perspective

Authors: Khadijah Saeed Khan

Abstract:

One of the major current issues facing Swedish society is integrating newly arrived refugees well into the host society. Cultural integration is one of the under-debated topics in the literature, and this study intends to address this gap from the Swedish perspective. The purpose of this study is to explore the role and types of cultural landscapes of refugee women in Sweden and how these landscapes help or hinder the settlement process. Cultural landscapes are understood here as a set of multiple cultural activities or practices which refugees perform in a specific context and circumstances (i.e., being in a new country) to seek, share, or use relevant information for their settlement. Information plays a vital role in various aspects of newcomers' lives in a new country. This article intends to highlight the importance of multiple cultural landscapes as a source of information (regarding employment, language learning, finding accommodation, immigration matters, health concerns, school and education, family matters, and other everyday matters) for refugees settling in Sweden. Relevant theories, such as information landscapes and socio-cultural theories, are considered in this study. A qualitative research design is employed, including semi-structured in-depth interviews and participant observation with 20 participants. The initial findings show that the refugee women encounter many information-related and integration-related challenges in Sweden and have built a network of cultural landscapes in which they practice various co-ethnic cultural and religious activities at different times of the year. These landscapes help them build a sense of belonging with people from their own or similar lands and assist them in seeking and sharing relevant information in everyday life in Sweden.

Keywords: cultural integration, cultural landscapes, information, women refugees

Procedia PDF Downloads 130
3109 Media Coverage of Cervical Cancer in Malawi: A National Sample of Newspapers and a Radio Station

Authors: Elida Tafupenji Kamanga

Abstract:

Cancer of the cervix remains one of the leading causes of death among Malawian women. Despite the government's introduction of free screening services throughout the country, uptake remains low and lack of knowledge high. Given the critical role the mass media play in relaying information, including health information, to the public and their influence on health behaviours, this study sought to analyse Malawian media coverage of the disease and its effectiveness. The findings of the study will help inform media advocacy directed at changing any coverage impeding the effective dissemination of cervical cancer messages, which will consequently help increase awareness and the uptake of screening among women. A content analysis of 29 newspapers and of promotional messages on cervical cancer from a local radio station was conducted for the period from 2012 to 2015. Overall, the results showed that media coverage, in terms of content and frequency, increased over the four-year period. However, of concern was the quality of the information both media presented to the public. The lapses in the information provided mean that little education is taking place through the media, which could be contributing to the women's knowledge gap and thereby affecting their decision to screen. A lack of adequate funding for media institutions and a lack of collaboration between media institutions and the stakeholders involved in the fight against the disease were also noted as contributing factors to the low coverage of the disease. Designing messages that are not only informative and educative but also innovative may help increase awareness, narrow the knowledge gap, and promote the adoption of preventive screening behaviour by Malawian women. Likewise, good communication between media institutions and the researchers involved in the fight against the disease, through channelling new findings back to the public, as well as increased funding towards similar causes, should be considered.

Keywords: cervical cancer, effectiveness, media coverage, screening

Procedia PDF Downloads 186
3108 Detection and Quantification of Ochratoxin A in Food by Aptasensor

Authors: Moez Elsaadani, Noel Durand, Brice Sorli, Didier Montet

Abstract:

Governments and international bodies are trying to improve food safety systems to prevent, reduce, or avoid the increase of foodborne diseases. This food risk is one of the major concerns for humanity. Contamination by mycotoxins is a threat to the health and life of humans and animals. One of the most common mycotoxins contaminating feed and foodstuffs is Ochratoxin A (OTA), a secondary metabolite produced by Aspergillus and Penicillium strains. OTA has a chronic toxic effect and has proved to be mutagenic, nephrotoxic, teratogenic, immunosuppressive, and carcinogenic. On the other hand, because of their high stability, specificity, affinity, and easy chemical synthesis, aptamer-based methods are applied to OTA biosensing as an alternative to traditional analytical techniques. In this work, five aptamers were tested to confirm, qualitatively and quantitatively, their binding with OTA. At the same time, three different analytical methods were tested and compared based on their ability to detect and quantify OTA. The best protocol established to quantify free OTA, as distinct from aptamer-bound OTA, involved an ultrafiltration method in a green coffee solution. OTA was quantified by HPLC-FLD to calculate the binding percentage of all five aptamers. One aptamer (the most effective, with 87% binding to OTA) was selected as the biorecognition element to study its electrical response (variation of electrical properties) in the presence of OTA, in order to pair it with radio frequency identification (RFID). This device, characterized by low cost, speed, and simple wireless information transmission, will combine knowledge of mycotoxin molecular sensors (aptamers) with an electronic device that links the information and the quantification and makes them available to operators.

Keywords: aptamer, aptasensor, detection, Ochratoxin A

Procedia PDF Downloads 164
3107 Concept Analysis of Professionalism in Teachers and Faculty Members

Authors: Taiebe Shokri, Shahram Yazdani, Leila Afshar, Soleiman Ahmadi

Abstract:

Introduction: The importance of professionalism in higher education lies not only in determining appropriate and inappropriate behaviors and guiding faculty members in the implementation of their professional responsibilities, but also in guaranteeing faculty members' adherence to professional principles and values, ensuring the quality of teaching, facilitating the teaching-learning process in universities, and increasing the commitment to meet the needs of students as well as the development of an ethical culture. Therefore, considering the important role of medical education teachers in preparing the teachers and students of the future, and the need to determine the concept of the professional teacher and the characteristics of teacher professionalism, we explain the concept of professionalism in teachers in this study. Methods: The concept analysis method used in this study was the Walker and Avant method, which has eight steps. Walker and Avant state the purpose of concept analysis as follows: the process of distinguishing between the defining features of a concept and its unrelated features. The process of concept analysis includes selecting a concept, determining the purpose of the analysis, identifying the uses of the concept, determining the defining features of the concept, identifying a model case, identifying borderline and contrary cases, identifying the antecedents and consequences of the concept, and defining empirical referents. Results: Professionalism, in its general sense, requires deep knowledge, insight, the creation of a healthy and safe environment, honesty and trust, impartiality, commitment to the profession and continuous improvement, punctuality, openness to criticism, professional competence, responsibility, and individual accountability, especially in social interactions, together with a striving for continuous improvement. The acquisition of these characteristics is not easily achieved and requires education, especially continuous learning. Professionalism is a set of values, behaviors, and relationships that underpin public trust in teachers.

Keywords: concept analysis, medical education, professionalism, faculty members

Procedia PDF Downloads 143
3106 Novel Bioinspired Design to Capture Smoky CO2 by Reactive Absorption with Aqueous Scrubber

Authors: J. E. O. Hernandez

Abstract:

In the next 20 years, energy production by burning fuels will increase, and so will the atmospheric concentration of CO2 and its well-known threats to life on Earth. The technologies available for capturing CO2 are still dubious, and this keeps fostering interest in bio-inspired approaches. The leading one is the application of carbonic anhydrase (CA), a superfast biocatalyst able to convert up to one million molecules of CO2 per second into carbonates in water. However, natural CA underperforms when applied to real smoky CO2 in chimneys, and, so far, the efforts to create superior CAs in the lab rely on screening methods running under pristine conditions at the micro level, which are far from resembling those in chimneys. For the evolution of man-made enzymes, selection rather than screening would be ideal, but this is challenging because of the need for a suitable artificial environment that is also sustainable for our society. Herein we present the stepwise design and construction of a bioprocess (from bench scale to semi-pilot) for evolutionary selection experiments. In this bioprocess, reaction and absorption took place simultaneously at atmospheric pressure in a spray tower. The scrubbing solution was fed countercurrently by reusing municipal water pressure, and it was mainly prepared with water, carbonic anhydrase, and calcium chloride. This bioprocess allowed for the enzymatic carbonation of smoky CO2, the reuse of process water, and the recovery of solid carbonates without cooling of the smoke, pretreatments, amine solvents, or compression of CO2. The average yield of solid carbonates was 0.54 g min-1, or 12-fold the amount produced in serum bottles at laboratory bench scale. This bioprocess could be used as a tailor-made environment for driving the selection of superior CAs. The bioprocess and its matched CA could be used sustainably to reduce global warming from CO2 emissions from exhausts.

Keywords: biological carbon capture and sequestration, carbonic anhydrase, directed evolution, global warming

Procedia PDF Downloads 183
3105 Creation of Computerized Benchmarks to Facilitate Preparedness for Biological Events

Authors: B. Adini, M. Oren

Abstract:

Introduction: Communicable diseases and pandemics pose a growing threat to the well-being of the global population. A vital component of protecting public health is the creation and sustenance of continuous preparedness for such hazards. A joint Israeli-German task force was deployed in order to develop an advanced tool for self-evaluation of emergency preparedness for various types of biological threats. Methods: Based on a comprehensive literature review and interviews with leading content experts, an evaluation tool was developed based on quantitative and qualitative parameters and indicators. A modified Delphi process was used to achieve consensus among over 225 experts from Germany and Israel concerning the items to be included in the evaluation tool. The validity and applicability of the tool for medical institutions were examined in a series of simulation and field exercises. Results: Over 115 German and Israeli experts reviewed and examined the proposed parameters as part of the modified Delphi cycles. A consensus of over 75% of the experts was attained for 183 out of 188 items. The relative importance of each parameter was rated as part of the Delphi process in order to define its impact on overall emergency preparedness. The parameters were integrated into computerized, web-based software that enables the calculation of emergency preparedness scores for biological events. Conclusions: The parameters developed in the joint German-Israeli project serve as benchmarks that delineate the actions to be implemented in order to create and maintain ongoing preparedness for biological events. The computerized evaluation tool enables continuous monitoring of the level of readiness, so that strengths and gaps can be identified and corrected appropriately. The adoption of such a tool is recommended as an integral component of quality assurance of public health and safety.
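
As a purely hypothetical illustration of how such a tool might turn Delphi-weighted indicators into a single score (this is not the project's software), a weighted mean over rated indicators could look as follows.

```python
# Minimal sketch (hypothetical indicators and weights, not the project's parameters):
# an overall preparedness score computed as a weighted mean of indicator ratings,
# with the weights standing in for the relative importance assigned in the Delphi process.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    weight: float   # relative importance from the Delphi rating (hypothetical values)
    score: float    # observed compliance, 0.0 (absent) to 1.0 (fully met)

indicators = [
    Indicator("Isolation capacity available", weight=3.0, score=1.0),
    Indicator("Staff trained in PPE use",     weight=2.0, score=0.5),
    Indicator("Surge staffing plan in place", weight=1.5, score=0.0),
]

def preparedness_score(items: list[Indicator]) -> float:
    """Weighted mean of indicator scores, scaled to 0-100."""
    total_weight = sum(i.weight for i in items)
    return 100.0 * sum(i.weight * i.score for i in items) / total_weight

print(f"Preparedness score: {preparedness_score(indicators):.1f} / 100")
```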

Keywords: biological events, emergency preparedness, bioterrorism, natural biological events

Procedia PDF Downloads 413
3104 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder

Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh

Abstract:

In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for the identification of activities in the human brain remains a big challenge because of the random nature of the signals. The feature extraction method is a key issue in solving this problem. Finding features that give distinctive pictures for different activities and similar pictures for the same activity is very difficult, especially as the number of activities grows. Classifier accuracy depends on the quality of this feature set. Further, more features result in high computational complexity, while fewer features compromise performance. In this paper, a novel idea for the selection of an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a vast number of features are extracted from the EEG signals and fed to the autoencoder deep neural network. The autoencoder encodes the input features into a small set of codes. To avoid the vanishing gradient problem and the need for dataset normalization, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between the encoder input and the decoder output. To reduce the feature set to a smaller one, 4 hidden layers are considered in the autoencoder network; hence, it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined into the responses of the hidden layers. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both. The performance of the proposed method is further validated against two other methods recently reported in the literature, which reveals that the proposed method is far better than the other two in terms of classification accuracy.
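
A minimal sketch of a four-hidden-layer autoencoder of the general kind described above is shown below; layer sizes and data are hypothetical, and the paper's meta-heuristic MSE minimization is replaced here by the standard Adam optimizer for brevity.

```python
# Minimal sketch (hypothetical sizes and placeholder data, not the authors' HO-DAE):
# a deep autoencoder with 4 hidden layers that compresses an EEG feature vector
# into a small code, trained to minimize reconstruction MSE.
import numpy as np
from tensorflow.keras import layers, Model

n_features = 512   # hypothetical size of the extracted EEG feature vector
code_size = 32     # hypothetical size of the reduced feature set

inputs = layers.Input(shape=(n_features,))
h1 = layers.Dense(256, activation="relu")(inputs)        # hidden layer 1 (encoder)
code = layers.Dense(code_size, activation="relu")(h1)    # hidden layer 2: the code
h3 = layers.Dense(code_size, activation="relu")(code)    # hidden layer 3 (decoder)
h4 = layers.Dense(256, activation="relu")(h3)            # hidden layer 4 (decoder)
outputs = layers.Dense(n_features, activation="linear")(h4)  # reconstruction

autoencoder = Model(inputs, outputs)
encoder = Model(inputs, code)                # used to produce the reduced features
autoencoder.compile(optimizer="adam", loss="mse")

X = np.random.rand(1000, n_features).astype("float32")   # placeholder feature matrix
autoencoder.fit(X, X, epochs=10, batch_size=64, verbose=0)

X_reduced = encoder.predict(X)               # reduced feature set fed to a classifier
print(X_reduced.shape)                       # (1000, 32)
```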

Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization

Procedia PDF Downloads 103
3103 Nietzsche's 'Will to Power' as a Potentially Irrational-Rational Psychopathology: How and Why Amor Fati May Prove to Be Its 'Horse Whisperer'

Authors: Nikolai David Blaskow

Abstract:

Nietzsche scholarship in the main has never quite resolved its deeply divided, at times self-contradictory, responses to what Friedrich Nietzsche might actually have meant by his notion of the 'will to power'. Yet, in the context of the current global pandemic and the climate change crisis, never has there been a more urgent need to investigate and resolve that contradiction. This paper argues for the 'will to power' as a potentially irrational-rational psychopathology, one that can properly be understood only by means of Nietzsche's agonistic insights into another psychopathology, that of ressentiment. The argument also makes a case for the contention that amor fati (Nietzsche's positive affirmation of life) may prove to be ressentiment's cure. In addition, as an integral part of the case's methodology, the lens of the mimetic and scapegoat theory of Rene Girard (1923-2015) is brought to bear on resolving the contradiction. Ressentiment and mimetic theory prove to be key players in the investigation, inasmuch as they expose the reasons for a modernity in crisis. The major finding of this study is that when the explanatory power of the two theories is applied, an understanding of the dynamics of the crisis in which we find ourselves emerges. The keys to that insight include: (1) how these two psychopathologies closely resemble the contemporary, neurologically defined 'borderline conditions' and their implications for culture; (2) how identity politics stifles exemplary leadership and so creates toxic cultures; and (3) a critical assessment of Achille Mbembe's (2019) reworking of Frantz Fanon's 'ethics of the passerby' and its resonances with Nietzsche's amor fati.

Keywords: agon, amor fati, borderline conditions, ethics of the passer by, exemplary leadership, identity politics, mimesis, ressentiment, scapegoat mechanism

Procedia PDF Downloads 241
3102 A Constructed Wetland as a Reliable Method for Grey Wastewater Treatment in Rwanda

Authors: Hussein Bizimana, Osman Sönmez

Abstract:

Constructed wetlands are currently the most widely recognized wastewater treatment option, especially in developing countries, where they have the potential to improve water quality and create valuable wildlife habitat in the ecosystem, with treatment requirements that are relatively simple in terms of operation and maintenance costs. The lack of grey wastewater treatment facilities at the Kigali Institute of Science and Technology in Rwanda causes pollution in the surrounding localities of Rugunga sector, where poor sanitation is already a problem. In order to treat the grey water produced at the Kigali Institute of Science and Technology, which has high BOD concentration, high nutrient concentration, and high alkalinity, a pilot-scale horizontal sub-surface flow constructed wetland was designed that can operate at the institute. The study was carried out with a sedimentation tank of 5.5 m x 1.42 m x 1.2 m deep and a horizontal sub-surface constructed wetland of 4.5 m x 2.5 m x 1.42 m deep. A grey wastewater flow of 2.5 m3/d passes through the vegetated, sandy pilot wetland. The filter medium consists of coarse sand of 0.6 to 2 mm with a hydraulic conductivity of 0.00003472 m/s, and cattails (Typha latifolia spp.) are used as the plant species. The effluent flow rate of the plant is designed to be 1.5 m3/day, and the retention time will be 24 hrs. Removals of 72% to 79% of BOD, COD, and TSS are estimated to be achieved, while nutrient (nitrogen and phosphate) removal is estimated to be in the range of 34% to 53%. Every effluent characteristic is expected to meet the Rwanda Utility Regulatory Agency guidelines, primarily because the retention time allowed is sufficient to reduce the contaminants in the raw wastewater. A treated-water reuse system was developed whereby the water will be used again in the campus irrigation system.

Keywords: constructed wetlands, hydraulic conductivity, grey waste water, cattails

Procedia PDF Downloads 592
3101 Rural Community Knowledge, Attitude and Perceptions of Consuming Dried Vegetables in Central Region of Tanzania

Authors: Radegunda Kessy, Justus Ochieng, Victor Afari-Sefa, Takemore Chagomoka, Ngoni Nenguwo

Abstract:

Vegetables are excellent sources of dietary fiber, vitamins, and minerals and constitute an indispensable component of diets, but in Tanzania and other Sub-Saharan African countries, they are not readily available all year round due to seasonal variations in the production cycle. Drying is one of the traditional methods of food preservation known to man. The Dodoma and Singida regions of Tanzania are characterized by a semi-arid agro-climate and thus experience a short seasonal supply of fresh vegetables followed by a long drought, during which dried vegetables become an alternative to meet high household demand. A primary survey of 244 rural consumers was carried out to understand how the knowledge, attitudes, and perceptions of rural consumers affect the consumption of dried vegetables. All sample respondents were found to be aware of open sun drying of vegetables, while less than 50% of them were aware of solar-dried vegetables. Consumers were highly concerned with hygiene, nutritional value, taste, drying method, freshness, the color of the dried vegetables, timely availability, and ease of cooking as important factors they consider before purchasing dried vegetables. Logit model results show that gender, income, years of consuming dried vegetables, awareness of the importance of solar-dried vegetables vis-à-vis sun-dried alternatives, and employment status influenced rural consumers' decisions to purchase dried vegetables. Preferences for dried vegetables differ across the regions, which is also an important consideration for any future planned interventions. The findings imply that development partners and policymakers need to design better social marketing and promotion techniques for the enhanced adoption of solar drying technology, which will greatly improve the quality and utilization of dried vegetables by target households.
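
A minimal sketch of the logit specification reported above is shown below; the file name and variable names are hypothetical stand-ins for the survey data.

```python
# Minimal sketch (hypothetical variables, not the authors' survey data): a binary
# logit model of the dried-vegetable purchase decision of the kind reported above.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dried_vegetable_survey.csv")   # hypothetical file name

y = df["purchases_dried_veg"]          # 1 = purchases dried vegetables, 0 = does not
X = df[["gender", "income", "years_consuming",
        "aware_solar_drying", "employed"]]
X = sm.add_constant(X)

logit_model = sm.Logit(y, X).fit()
print(logit_model.summary())                 # signs/significance indicate influence
print(logit_model.get_margeff().summary())   # average marginal effects, easier to read
```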

Keywords: dried vegetables, postharvest management, sun drying, solar drying

Procedia PDF Downloads 182
3100 A Bicycle-Based Model of Prehospital Care Implemented in the Northeast of Brazil: Initial Experience

Authors: Odaleia de O. Farias, Suzelene C. Marinho, Ecleidson B. Fragoso, Daniel S. Lima, Francisco R. S. Lira, Lara S. Araújo, Gabriel dos S. D. Soares

Abstract:

In populous cities, prehospital care services that use vehicles other than ambulances are needed in order to reduce costs and improve response times to occurrences in areas with large concentrations of people, such as leisure and tourism spaces. In this context, an innovative quick-access and assistance program called BIKE VIDA was implemented. The aim of this study is to describe the implementation and the initial profile of occurrences attended by an urgency/emergency prehospital care service delivered by paramedics on bicycles. This is a cross-sectional, descriptive study carried out in the city of Fortaleza, Ceara, Brazil. The data comprised service records from July to August 2017. Ethical aspects were respected. The service covers a perimeter of 4.5 km, divided into three areas with a perimeter of 1.5 km for each paramedic, operating from 5 am to 9 pm. Materials transported on the bicycles include an automated external defibrillator (AED), portable oxygen, an oximeter, a cervical collar, a stethoscope, a sphygmomanometer, dressing and immobilization materials, and personal protective equipment. Occurrences are requested directly by calling the emergency number 192 or by approaching the professional directly. In the first month of the program, there were 93 emergencies/urgencies, mainly in the daytime period (71.0%), in males (59.7%), and in the age range of 26 to 45 years (46.2%). The main nature of the occurrences was traumatic incidents (53.3%). Most of the cases (88.2%) did not require ambulance transport to the hospital, and there were two deaths. Prehospital care delivered by bicycle is an innovative strategy in Brazil and has shown itself to be promising in terms of reducing costs and improving the quality of the services offered.

Keywords: emergency, response time, prehospital care, urgency

Procedia PDF Downloads 185
3099 Probabilistic Crash Prediction and Prevention of Vehicle Crash

Authors: Lavanya Annadi, Fahimeh Jafari

Abstract:

Transportation brings immense benefits to society, but it also has its costs. These include the costs of infrastructure, personnel, and equipment, but also the loss of life and property in road traffic accidents, delays in travel due to traffic congestion, and various indirect costs in terms of air transport. Much research has been done to identify the various factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is to predict the crash probability of vehicles in the United States using machine learning, based on natural and structural factors and excluding spontaneous factors such as overspeeding. These factors range from weather-related factors, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to human-made road structure factors such as bumps, roundabouts, no-exit roads, turning loops, give-way signs, etc. Probabilities are divided into ten different classes. All the predictions are based on multiclass classification techniques, which are supervised learning methods. This study considers all crashes, across all states, recorded by the US government. To calculate the probability, the multinomial expected value was used and assigned as the crash probability classification label. We applied three different classification models: multiclass logistic regression, Random Forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural factors in crashes. The paper also provides in-depth insights through exploratory data analysis.
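
The sketch below illustrates the three multiclass models named above on a hypothetical crash table; the file name, feature names, and class labels are assumptions, not the authors' dataset.

```python
# Minimal sketch (hypothetical feature names, not the authors' data): training the
# three multiclass models named in the abstract and comparing their test accuracies.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

df = pd.read_csv("us_crashes.csv")               # hypothetical file
features = ["precipitation", "visibility", "wind_speed", "temperature",
            "humidity", "bump", "roundabout", "no_exit", "turning_loop"]
X, y = df[features], df["crash_probability_class"]   # ten classes, labeled 0-9

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42,
                                          stratify=y)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(n_estimators=300, random_state=42),
    "XGBoost": XGBClassifier(objective="multi:softprob", eval_metric="mlogloss"),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```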

Keywords: road safety, crash prediction, exploratory analysis, machine learning

Procedia PDF Downloads 101
3098 Risk Issues for Controlling Floods through Unsafe, Dual Purpose, Gated Dams

Authors: Gregory Michael McMahon

Abstract:

Risk management for the purpose of minimizing damage from dam operations has met with opposition from organisations, authorities, and their practitioners. The cause appears to be a misunderstanding of risk management arising from exchanges that mix deterministic thinking with risk-centric thinking and that do not separate uncertainty from reliability or accuracy from probability. This paper sets out the misunderstandings that arose from dam operations at Wivenhoe in 2011, comparing outcomes based on the methodology and its rules with outcomes produced by applying misunderstandings of those rules. The paper addresses the performance of one risk-centric Flood Manual for Wivenhoe Dam in achieving a risk management outcome. A mixture of engineering, administrative, and legal factors appears to have combined to reduce the outcomes of the risk approach; these are described. The findings are that a risk-centric Manual may need to assist administrations in conducting scenario training regimes, responding to healthy audit reporting, and developing decision-support systems. The principal assistance needed from the Manual, however, is to bring engineering and the law to a good understanding of how risks are managed; it should not be assumed that risk management is understood. The wider findings are that the critical profession for decision-making downstream of the meteorologist is not dam engineering, hydrology, or hydraulics; it is risk management. Risk management will provide the minimum flood-damage outcome where actual rainfalls match or exceed forecast rainfalls; it will therefore provide the best approach over the likely history of flooding in the life of a dam, and provisions made for worst cases may be state of the art in risk management. The principal conclusion is the need for training both in risk management as a discipline and in the application of risk management rules to particular dam operational scenarios.

Keywords: risk management, flood control, dam operations, deterministic thinking

Procedia PDF Downloads 71
3097 Effects of pH, Load Capacity and Contact Time in the Sulphate Sorption onto a Functionalized Mesoporous Structure

Authors: Jaime Pizarro, Ximena Castillo

Abstract:

The intensive use of water in agriculture, industry, and human consumption, together with increasing pollution, reduces the availability of water for future generations; the challenge is to advance sustainable, low-cost solutions to reuse water and ensure the availability of the resource in both quality and quantity. The use of new low-cost materials with sorption capacity for pollutants is one solution that contributes to improving and expanding water treatment and reuse systems. Fly ash, a residue from coal combustion in power plants that is produced in large quantities in newly industrialized countries, contains high amounts of silicon and aluminum oxides, whose properties can be exploited for the synthesis of mesoporous materials. Properly functionalized, this material yields matrices with high sorption capacity. Mesoporous materials have a large surface area, thermal and mechanical stability, a uniform porous structure, and high sorption and functionalization capacities. The goal of this study was to develop a hexagonal mesoporous silica (HMS) material for the adsorption of sulphate from industrial and mining waters. The silica was extracted from fly ash after calcination at 850 °C, followed by the addition of water. The mesoporous structure has a surface area of 282 m2 g-1 and a pore size of 5.7 nm, and was functionalized with ethylenediamine through a self-assembly method. The material was characterized by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS). Sulphate sorption capacity was evaluated as a function of pH, maximum load capacity, and contact time. The maximum sulphate adsorption capacity was 146.1 mg g-1, about three times higher than that of commercial sorbents. The kinetic data were fitted by a pseudo-second-order model with a high linear regression coefficient at different initial concentrations. The adsorption isotherm that best fitted the experimental data was the Freundlich model.
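As a concrete illustration of the two fits reported above, the sketch below uses SciPy's curve_fit with the standard pseudo-second-order form qt = k2·qe²·t / (1 + k2·qe·t) and the Freundlich isotherm qe = KF·Ce^(1/n). The numeric data points are synthetic placeholders, not the authors' measurements.

```python
# Minimal sketch: fitting pseudo-second-order kinetics and a Freundlich isotherm
# to example data (qt, qe in mg g-1; Ce in mg L-1; t in minutes).
import numpy as np
from scipy.optimize import curve_fit

def pso(t, qe, k2):
    # Pseudo-second-order uptake: qt = k2*qe^2*t / (1 + k2*qe*t)
    return (k2 * qe**2 * t) / (1.0 + k2 * qe * t)

def freundlich(ce, kf, n):
    # Freundlich isotherm: qe = KF * Ce^(1/n)
    return kf * ce**(1.0 / n)

t  = np.array([5, 10, 20, 40, 60, 120], dtype=float)
qt = np.array([60, 95, 120, 135, 140, 145], dtype=float)      # example uptake curve
(qe_fit, k2_fit), _ = curve_fit(pso, t, qt, p0=[146.0, 0.001])

ce = np.array([2, 5, 10, 25, 50, 100], dtype=float)
qe = np.array([35, 55, 75, 100, 125, 146], dtype=float)        # example equilibrium data
(kf_fit, n_fit), _ = curve_fit(freundlich, ce, qe, p0=[20.0, 2.0])

print(f"PSO fit: qe = {qe_fit:.1f} mg/g, k2 = {k2_fit:.4f} g/(mg*min)")
print(f"Freundlich fit: KF = {kf_fit:.1f}, n = {n_fit:.2f}")
```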

Keywords: fly ash, mesoporous siliceous, sorption, sulphate

Procedia PDF Downloads 145
3096 Studying the Photodegradation Behavior of Microplastics Released from Agricultural Plastic Products to the Farmland

Authors: Maryam Salehi, Gholamreza Bonyadinejad

Abstract:

The application of agricultural plastic products such as mulch films, greenhouse covers, and silage films is increasing because of the economic benefits they provide through an earlier and better-quality harvest. In 2015, the global market for agricultural plastic films, at 4 million tons (valued at 10.6 million USD), was estimated to grow by 5.6% per year through 2030. Despite the short-term benefits provided by plastic products, their long-term sustainability issues and negative impacts on soil health are not well understood. After removal from the field, some plastic residue remains in the soil. Plastic residues in farmland may fragment into small particles called microplastics (d < 5 mm). Exposure of microplastics to solar radiation can alter their surface chemistry and make them susceptible to further fragmentation. This study therefore examined the photodegradation of low-density polyethylene as a model for microplastics released to agricultural farmland. Changes in the plastics' surface chemistry, morphology, and bulk characteristics were studied after accelerated UV-A radiation experiments and after sampling from an agricultural field. Attenuated Total Reflectance Fourier Transform Infrared Spectroscopy (ATR-FTIR) and X-ray Photoelectron Spectroscopy (XPS) demonstrated the formation of oxidized functional groups on the microplastic surfaces due to photodegradation. Differential Scanning Calorimetry (DSC) analysis revealed increased crystallinity in the photodegraded microplastics compared with new samples, and gel permeation chromatography (GPC) showed a reduced molecular weight for the polymer. This study provides an important opportunity to advance the understanding of soil pollution. Understanding how plastic residues change as they are left in the soil provides a critical piece of information for better estimating the impacts of microplastics on environmental biodiversity, ecosystem sustainability, and food safety.
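The DSC crystallinity comparison can be made concrete with the usual ratio of the measured melting enthalpy to the enthalpy of a fully crystalline polyethylene reference. The sketch below assumes a commonly cited reference value of about 293 J/g and hypothetical melting enthalpies; the authors' actual figures are not reported in the abstract.

```python
# Sketch of percent crystallinity from DSC melting enthalpy (hypothetical values).
DH_100_PE = 293.0  # J/g, assumed reference enthalpy for 100% crystalline polyethylene

def percent_crystallinity(dh_melt_j_per_g: float, dh_ref: float = DH_100_PE) -> float:
    """Crystallinity (%) = measured melting enthalpy / reference enthalpy * 100."""
    return 100.0 * dh_melt_j_per_g / dh_ref

pristine  = percent_crystallinity(105.0)   # e.g. new LDPE film (illustrative)
weathered = percent_crystallinity(128.0)   # e.g. UV-A exposed microplastic (illustrative)
print(f"pristine: {pristine:.1f}%  weathered: {weathered:.1f}%")
```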

Keywords: soil health, plastic pollution, sustainability, photodegradation

Procedia PDF Downloads 207
3095 The Design Method of Artificial Intelligence Learning Picture: A Case Study of DCAI's New Teaching

Authors: Weichen Chang

Abstract:

To create a guided teaching method for AI generative drawing design, this paper develops a set of teaching models for AI generative drawing (DCAI) that combines learning modes such as problem-solving, thematic inquiry, phenomenon-based learning, task-oriented learning, and DFC. Through a guided learning program and content built around an information-security AI picture book, participatory action research (PAR) and interviews were applied to explore how the dual knowledge of Context and ChatGPT (DCAI) guides the development of students' AI learning skills. In the interviews, the students highlighted five main learning outcomes (self-study, critical thinking, knowledge generation, cognitive development, and presentation of work) as well as the challenges of implementing the model. Through the use of DCAI, students strengthen their shared awareness of generative drawing analysis and group cooperation, and gain knowledge that can enhance their AI capabilities in DCAI inquiry and in future life. The conclusions are: (1) good use of DCAI can assist students in exploring the value of their knowledge through the power of stories and in finding the meaning of knowledge communication; (2) analyzing the integrity and coherence of a story through its context achieves the narrative tension of 'starting and ending'; (3) ChatGPT can be used to extract inspiration, arrange story compositions, and craft prompts that communicate with people and convey emotions. Therefore, new methods of knowledge construction will be among the effective approaches to AI learning in the face of artificial intelligence, providing new thinking and new expression for interdisciplinary design and design education practice.

Keywords: artificial intelligence, task-oriented, contextualization, design education

Procedia PDF Downloads 14
3094 O-LEACH: The Problem of Orphan Nodes in the LEACH Routing Protocol for Wireless Sensor Networks

Authors: Wassim Jerbi, Abderrahmen Guermazi, Hafedh Trabelsi

Abstract:

The optimum use of coverage in wireless sensor networks (WSNs) is very important. The LEACH protocol (Low Energy Adaptive Clustering Hierarchy) presents a hierarchical clustering algorithm for wireless sensor networks that allows the formation of distributed clusters. In each cluster, LEACH randomly selects some sensor nodes, called cluster heads (CHs), through a probabilistic calculation. Each non-CH node is supposed to join a cluster and become a cluster member. Nevertheless, the CHs can become concentrated in one part of the network, leaving several sensor nodes unable to reach any CH. To solve this problem, we created O-LEACH, an orphan-node protocol whose role is to reduce the number of sensor nodes that do not belong to any cluster. A cluster member called a gateway receives messages from neighboring orphan nodes and informs its CH that these nodes do not belong to any group. The gateway, acting as a CH', then attaches the orphan nodes to the cluster and collects their data. O-LEACH enables a new form of cluster formation that leads to a long network lifetime and minimal energy consumption. Orphan nodes possess enough energy and seek to be covered by the network. The principal novel contribution of the proposed work is the O-LEACH protocol, which provides coverage of the whole network with a minimum number of orphan nodes and a very high connectivity rate. As a result, the WSN application receives data from the entire network, including orphan nodes. The proper functioning of the application therefore requires management of the intelligent resources present within each sensor node. The simulation results show that O-LEACH performs better than LEACH in terms of coverage, connectivity rate, energy, and scalability.
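To make the orphan-node idea concrete, the sketch below simulates a single LEACH round with the standard cluster-head election threshold and then attaches out-of-range nodes through the nearest in-range cluster member acting as a gateway (CH'). It is a simplified illustration under assumed parameters (node count, radio range, CH probability, random deployment), not the authors' O-LEACH implementation.

```python
# Simplified one-round sketch: LEACH CH election plus orphan attachment via gateways.
import math
import random

P, RADIO_RANGE = 0.05, 30.0   # assumed CH probability and radio range

def leach_threshold(p, rnd):
    # Standard LEACH election threshold T(n) = p / (1 - p * (r mod 1/p))
    return p / (1.0 - p * (rnd % round(1.0 / p)))

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

nodes = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(100)]
rnd = 0
chs = [n for n in nodes if random.random() < leach_threshold(P, rnd)]

# Nodes within range of a CH become cluster members; the rest are orphans.
members, orphans = [], []
for n in nodes:
    if n in chs:
        continue
    reachable = [ch for ch in chs if distance(n, ch) <= RADIO_RANGE]
    (members if reachable else orphans).append(n)

# Orphan handling: each orphan joins via its nearest in-range cluster member,
# which acts as a gateway (CH') relaying the orphan's data to the real CH.
attached = 0
for o in orphans:
    gateways = [m for m in members if distance(o, m) <= RADIO_RANGE]
    if gateways:
        gateway = min(gateways, key=lambda m: distance(o, m))
        attached += 1

print(f"CHs: {len(chs)}, orphans: {len(orphans)}, attached via gateways: {attached}")
```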

Keywords: WSNs, routing, LEACH, O-LEACH, orphan nodes, sub-cluster, gateway, CH'

Procedia PDF Downloads 360