Search results for: machine learning approach for neurological disorder assessment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24557


17417 Formulation and in Vitro Evaluation of Cubosomes Containing CeO₂ Nanoparticles Loaded with Glatiramer Acetate Drug

Authors: Akbar Esmaeili, Zahra Salarieh

Abstract:

Cerium oxide nanoparticles (nanoceria) are used as catalysts in industrial applications due to their free radical scavenging properties. Given that free radicals play an essential role in the pathology of many neurological diseases, we investigated the use of nanoceria as a potential therapeutic agent against oxidative damage. In this project, nanoceria were synthesized through a new and environmentally friendly bio-pathway: introducing cerium nitrate into a culture medium inoculated with a Lactobacillus acidophilus strain before incubation produces the nanoparticles. The glatiramer acetate (GA)-loaded carrier was formed by coating the CeO₂ with carboxymethylcellulose (CMC). FE-SEM analysis showed nanoceria in the 9-11 nm range with a spherical shape and a uniform particle size distribution. Cubosomal nanoparticles containing the anti-multiple sclerosis (anti-MS) drug GA were used. Glycerol monostearate (GMS) was used as the fat base, and evening primrose extract was used as an anti-inflammatory agent in the cubosomes. Design-Expert® software was used to study the effects of different formulation factors on the properties of the GA-loaded cubic dispersions. Thirty GA-loaded cubic dispersions were prepared with GA-loaded carboxymethylcellulose and evaluated in vitro. The results showed an average particle size of 89.02 and a zeta potential of -49.9. Cubosomes containing GA-CMC/CeO₂ showed a stable release profile for 180 min. The results showed that cubosomes containing GA-CMC/CeO₂ could be a promising drug carrier with normal release behavior.

Keywords: biochemistry, biotechnology, molecular biology

Procedia PDF Downloads 27
17416 Third Party Logistics (3PL) Selection Criteria for an Indian Heavy Industry Using SEM

Authors: Nadama Kumar, P. Parthiban, T. Niranjan

Abstract:

In the present paper, we propose an integrated approach for 3PL supplier selection that suits the distinctive strategic needs of the outsourcing organization in the southern part of India. Four fundamental criteria have been used, namely Performance, IT, Service, and Intangibles. These are further subdivided into fifteen sub-criteria. The proposed strategy combines Structural Equation Modeling (SEM) and non-additive fuzzy integral methods. The introduction of fuzziness handles the vagueness of human judgments. The SEM approach has been used to validate the selection criteria for the proposed model, while the non-additive fuzzy integral approach uses the SEM model results to evaluate a supplier selection score. The case organization has an exclusive vertically integrated structure comprising several companies, each focusing on a narrow segment of the value chain. To ensure manufacturing and logistics proficiency, it relies significantly on 3PL suppliers to attain supply chain superiority. However, 3PL supplier selection is an intricate decision-making procedure involving multiple selection criteria. The goal of this work is to identify the crucial 3PL selection criteria by using the non-additive fuzzy integral approach. Unlike traditional multi-criteria decision-making (MCDM) methods, which frequently assume independence among criteria and additive importance weights, the non-additive fuzzy integral is an effective method for handling dependency among criteria, vague information, and the inherent fuzziness of human judgment. In this work, we present an empirical case that employs the non-additive fuzzy integral to assess the importance weight of selection criteria and indicate the most suitable 3PL supplier.
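
The non-additive fuzzy integral referred to above is typically computed as a Choquet integral with respect to a fuzzy measure. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the criterion names, densities, and supplier scores are invented, and a Sugeno λ-measure stands in for whatever fuzzy measure the study identified.

```python
# Choquet integral aggregation of supplier scores w.r.t. a non-additive fuzzy measure.
# All numbers are illustrative; the Sugeno lambda-measure is an assumed construction.
from math import prod
from scipy.optimize import brentq

densities = {"performance": 0.35, "it": 0.15, "service": 0.30, "intangible": 0.10}  # assumed weights

# Sugeno lambda-measure: find lambda with prod(1 + lam*g_i) = 1 + lam.
# Densities here sum to less than 1, so the root lies at lambda > 0.
f = lambda lam: prod(1 + lam * g for g in densities.values()) - (1 + lam)
lam = brentq(f, 1e-6, 50)

def mu(subset):
    """Measure of a subset of criteria under the lambda-measure."""
    if not subset:
        return 0.0
    return (prod(1 + lam * densities[c] for c in subset) - 1) / lam

def choquet(scores):
    """Discrete Choquet integral of criterion scores."""
    ordered = sorted(scores, key=scores.get)          # criteria, ascending by score
    total, prev = 0.0, 0.0
    for i, c in enumerate(ordered):
        total += (scores[c] - prev) * mu(ordered[i:]) # criteria scoring at least scores[c]
        prev = scores[c]
    return total

supplier = {"performance": 0.8, "it": 0.6, "service": 0.9, "intangible": 0.5}  # toy scores in [0, 1]
print(round(choquet(supplier), 3))                    # overall supplier selection score
```

Because the measure is non-additive, the weight of a pair of criteria need not equal the sum of their individual weights, which is how dependency among criteria is captured.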

Keywords: 3PL, non-additive fuzzy integral approach, SEM, fuzzy

Procedia PDF Downloads 269
17415 Developing Writing Skills of Learners with Persistent Literacy Difficulties through the Explicit Teaching of Grammar in Context: Action Research in a Welsh Secondary School

Authors: Jean Ware, Susan W. Jones

Abstract:

Background: The benefit of grammar instruction in the teaching of writing is contested in most English-speaking countries. A majority of Anglophone countries abandoned the teaching of grammar in the 1950s based on the conclusion that it had no positive impact on learners’ development of reading, writing, and language. Although the decontextualised teaching of grammar is not helpful in improving writing, a curriculum with a focus on grammar in an embedded and meaningful way can help learners develop their understanding of the mechanisms of language. Although British learners are generally not taught grammar rules explicitly, learners in schools in France, the Netherlands, and Germany are taught explicitly about the structure of their own language. Exposing learners to grammatical analysis can help them develop their understanding of language. Indeed, if learners are taught that each part of speech has an identified role in the sentence, then rather than having to memorise lists of words or spelling patterns, they can focus on determining each word or phrase’s task in the sentence. These processes of categorisation and deduction are higher order thinking skills. When considering definitions of dyslexia available in Great Britain, the explicit teaching of grammar in context could help learners with persistent literacy difficulties. Indeed, learners with dyslexia often develop strengths in problem solving; the teaching of grammar could, therefore, help them develop their understanding of language by using analytical and logical thinking. Aims: This study aims at gaining a further understanding of how the explicit teaching of grammar in context can benefit learners with persistent literacy difficulties. The project is designed to identify ways of adapting existing grammar-focussed teaching materials so that learners with specific learning difficulties such as dyslexia can use them to further develop their writing skills. It intends to improve educational practice through action, analysis and reflection. Research Design/Methods: The project, therefore, uses an action research design and multiple sources of evidence. The data collection tools used were standardised test data, teacher assessment data, semi-structured interviews, learners’ before and after attempts at a writing task at the beginning and end of the cycle, documentary data, and lesson observation carried out by a specialist teacher. Existing teaching materials were adapted for use with five Year 9 learners who had experienced persistent literacy difficulties from primary school onwards. The initial adaptations included reducing the amount of content to be taught in each lesson and pre-teaching some of the metalanguage needed. Findings: Learners’ before and after attempts at the writing task were scored by a colleague who did not know the order of the attempts. All five learners’ scores were higher on the second writing task. Learners reported that they had enjoyed the teaching approach. They also made suggestions to be included in the second cycle, as did the colleague who carried out observations. Conclusions: Although this is a very small exploratory study, these results suggest that adapting grammar-focussed teaching materials shows promise for helping learners with persistent literacy difficulties develop their writing skills.

Keywords: explicit teaching of grammar in context, literacy acquisition, persistent literacy difficulties, writing skills

Procedia PDF Downloads 144
17414 Development of Quality Assessment Tool to Gauge Fire Response Activities of Emergency Personnel in Denmark

Authors: Jennifer E. Lynette

Abstract:

The purpose of this study is to develop a nation-wide assessment tool to gauge the quality and efficiency of response activities by emergency personnel to fires in Denmark. Current fire incident reports lack detailed information that could lead to breakthroughs in research and improve emergency response efforts. Information generated from the report database is analyzed and assessed for efficiency and quality. By utilizing information collection gaps in the incident reports, an improved, in-depth, and streamlined quality gauging system is developed for use by fire brigades. This study pinpoints previously unrecorded factors involved in the response phases of a fire. Variables are recorded and ranked based on their influence on event outcome. By assessing and measuring these data points, quality standards are developed. These quality standards include details of the response phase previously overlooked which individually and cumulatively impact the overall success of a fire response effort. Through the application of this tool and the implementation of the associated quality standards at Denmark’s fire brigades, there is potential to increase efficiency and quality in the preparedness and response phases, thereby saving additional lives, property, and resources.

Keywords: emergency management, fire, preparedness, quality standards, response

Procedia PDF Downloads 306
17413 Role of Artificial Intelligence in Nano Proteomics

Authors: Mehrnaz Mostafavi

Abstract:

Recent advances in single-molecule protein identification (ID) and quantification techniques are poised to revolutionize proteomics, enabling researchers to delve into single-cell proteomics and identify low-abundance proteins crucial for biomedical and clinical research. This paper introduces a different approach to single-molecule protein ID and quantification using tri-color amino acid tags and a plasmonic nanopore device. A comprehensive simulator incorporating various physical phenomena was designed to predict and model the device's behavior under diverse experimental conditions, providing insights into its feasibility and limitations. The study employs a whole-proteome single-molecule identification algorithm based on convolutional neural networks, achieving high accuracies (>90%), particularly in challenging conditions (95–97%). To address potential challenges in clinical samples, where post-translational modifications may affect labeling efficiency, the paper evaluates protein identification accuracy under partial labeling conditions. Solid-state nanopores, capable of processing tens of individual proteins per second, are explored as a platform for this method. Unlike techniques relying solely on ion-current measurements, this approach enables parallel readout using high-density nanopore arrays and multi-pixel single-photon sensors. Convolutional neural networks contribute to the method's versatility and robustness, simplifying calibration procedures and potentially allowing protein ID based on partial reads. The study also discusses the efficacy of the approach in real experimental conditions, resolving functionally similar proteins. The theoretical analysis, protein labeler program, finite difference time domain calculation of plasmonic fields, and simulation of nanopore-based optical sensing are detailed in the methods section. The study anticipates further exploration of the temporal distributions of protein translocation dwell-times and their impact on convolutional neural network identification accuracy. Overall, the research presents a promising avenue for advancing single-molecule protein identification and quantification with broad applications in proteomics research. The contributions made in methodology, accuracy, robustness, and technological exploration collectively position this work at the forefront of transformative developments in the field.
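
As a rough illustration of the convolutional-network idea mentioned above, and not the paper's actual architecture, the sketch below maps a simulated three-channel (tri-color tag) translocation trace to a protein-class prediction. The channel count, trace length, and number of classes are assumptions made for the example.

```python
# Minimal 1-D CNN sketch for classifying simulated tri-color translocation traces.
import torch
import torch.nn as nn

N_CLASSES, TRACE_LEN = 20, 256          # assumed proteome subset size and trace length

class ProteinIDCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):                # x: (batch, 3 color channels, TRACE_LEN)
        return self.classifier(self.features(x).squeeze(-1))

model = ProteinIDCNN()
dummy = torch.randn(8, 3, TRACE_LEN)     # a batch of simulated traces
print(model(dummy).shape)                # -> torch.Size([8, 20]) class logits
```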

Keywords: nano proteomics, nanopore-based optical sensing, deep learning, artificial intelligence

Procedia PDF Downloads 59
17412 Smartphone Addiction and Reaction Time in Geriatric Population

Authors: Anjali N. Shete, G. D. Mahajan, Nanda Somwanshi

Abstract:

Context: Smartphones are the new generation of mobile phones; they have emerged over the last few years. Technology has developed so much that it has become part of our lives, and mobile phones are one example. These smartphones are equipped with the capabilities to display photos, play games, watch videos, provide navigation, etc. These advances have a huge impact on many walks of life. The adoption of new technology has been challenging for the elderly, but the older population is also moving towards digitally connected lives. As age advances, there is a decline in the motor and cognitive functions of the brain, and hence reaction time is affected. The study was undertaken to assess the usefulness of smartphones in improving cognitive functions. Aims and Objectives: The aim of the study was to observe the effects of smartphone addiction on reaction time in the elderly population. Material and Methods: This is an experimental study. 100 elderly subjects from urban areas were enrolled in the study randomly. They all were using smartphones for several hours a day. They were divided into two groups according to the scores of the Mobile Phone Addiction Scale (MPAS). Simple reaction time was estimated by the ruler drop method. The reaction time was then calculated for each subject in both groups. The data were analyzed using mean, standard deviation, and the Pearson correlation test. Results: The mean reaction time in Group A is 0.27 ± 0.040 and in Group B is 0.20 ± 0.032. The values show a statistically significant difference in reaction time. Conclusion: Group A, with a high MPAS score, has a low reaction time compared to Group B, with a low MPAS score. Hence, it can be concluded that the use of smartphones in the elderly is useful, delaying neurological decline and keeping the brain sharp.
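
For context on the ruler drop method mentioned above: the distance the ruler falls before being caught converts to a reaction time via t = sqrt(2d/g). The short sketch below shows that arithmetic; the catch distances are illustrative, not the study's data.

```python
# Ruler-drop reaction time: t = sqrt(2d / g), with d the catch distance.
from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def reaction_time(drop_distance_cm):
    """Reaction time in seconds from the distance the ruler fell before being caught."""
    d = drop_distance_cm / 100.0          # convert cm to metres
    return sqrt(2 * d / G)

trials_cm = [28.0, 24.5, 31.0]            # example catch distances for one subject
times = [reaction_time(d) for d in trials_cm]
print([round(t, 3) for t in times])       # roughly 0.22-0.25 s, the same order as the group means above
mean_rt = sum(times) / len(times)
```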

Keywords: smartphones, MPAS, reaction time, elderly population

Procedia PDF Downloads 160
17411 Conjunctive Management of Surface and Groundwater Resources under Uncertainty: A Retrospective Optimization Approach

Authors: Julius M. Ndambuki, Gislar E. Kifanyi, Samuel N. Odai, Charles Gyamfi

Abstract:

Conjunctive management of surface and groundwater resources is a challenging task due to the spatially and temporally variable nature of the hydrology as well as the hydrogeology of the water storage systems. Surface water-groundwater hydrogeology is highly uncertain; thus, it is imperative that this uncertainty is explicitly accounted for when managing water resources. Various methodologies have been developed and applied by researchers in an attempt to account for the uncertainty. For example, simulation-optimization models are often used for conjunctive water resources management. However, direct application of such an approach, in which all realizations are considered at each iteration of the optimization process, leads to a very expensive optimization in terms of computational time, particularly when the number of realizations is large. The aim of this paper, therefore, is to introduce and apply an efficient approach referred to as Retrospective Optimization Approximation (ROA) that can be used for optimizing conjunctive use of surface water and groundwater over multiple hydrogeological model simulations. This work is based on a stochastic simulation-optimization framework using the recently emerged technique of sample average approximation (SAA), a sampling-based method implemented within the ROA approach. The ROA approach solves and evaluates a sequence of generated optimization sub-problems with an increasing number of realizations (sample size). The response matrix technique was used for linking the simulation model with the optimization procedure. The k-means clustering sampling technique was used to map the realizations. The methodology is demonstrated through application to a hypothetical example. In the example, the generated optimization sub-problems were solved and analysed using the “Active-Set” core optimizer implemented under the MATLAB 2014a environment. Through the k-means clustering sampling technique, the ROA – Active Set procedure was able to arrive at a (nearly) converged maximum expected total optimal conjunctive water use withdrawal rate within a relatively small number of iterations (6 to 7 iterations). Results indicate that the ROA approach is a promising technique for optimizing conjunctive surface water and groundwater withdrawal rates under hydrogeological uncertainty.
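
The ROA loop described above can be sketched as solving a growing sequence of sample-average sub-problems over k-means representatives of the realizations. The sketch below is an assumption-laden toy, not the paper's model: a linear "drawdown" response stands in for the groundwater response matrix, and SciPy's SLSQP solver stands in for the MATLAB "Active-Set" optimizer.

```python
# Toy retrospective optimization approximation (ROA) with SAA sub-problems.
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import minimize

rng = np.random.default_rng(0)
realizations = rng.lognormal(mean=0.0, sigma=0.4, size=(200, 3))  # uncertain response factors (toy)

def simulate_drawdown(pumping, theta):
    # Toy linear response: drawdown per unit pumping at three wells, scaled by theta.
    return theta @ pumping

def saa_subproblem(sample, x0):
    # Maximize expected total withdrawal subject to an average drawdown limit (toy constraint).
    objective = lambda x: -np.sum(x)
    constraint = {"type": "ineq",
                  "fun": lambda x: 5.0 - np.mean([simulate_drawdown(x, th) for th in sample])}
    bounds = [(0.0, 3.0)] * 3
    res = minimize(objective, x0, bounds=bounds, constraints=[constraint], method="SLSQP")
    return res.x

x = np.full(3, 0.5)
for k, n in enumerate([5, 10, 20, 40, 80, 160]):          # growing sample sizes
    centers = KMeans(n_clusters=n, n_init=5, random_state=k).fit(realizations).cluster_centers_
    x = saa_subproblem(centers, x)                          # warm-start from the previous solution
print("near-optimal withdrawal rates:", np.round(x, 3))
```

Each pass reuses the previous solution as the starting point, which is what keeps the larger sub-problems cheap relative to solving over all realizations at once.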

Keywords: conjunctive water management, retrospective optimization approximation approach, sample average approximation, uncertainty

Procedia PDF Downloads 218
17410 Sexual and Reproductive Health for Women in Africa: Adopting a Human Rights Based Approach to Overcome Cultural Barriers

Authors: Seraphina Bakta

Abstract:

In many societies in Africa, it is taboo to speak about, let alone practice or in any way engage in, matters relating to sexual and reproductive health. For instance, girls using contraceptives may be labeled prostitutes, and married women using family planning methods may be divorced on the ground that they are disobedient to their husbands because they do not want to bear children. As such, sexual and reproductive health as a right is still very far from reality for many men and women. To a large extent, the objections are mainly backed up by culture, which is deeply rooted in many African traditions. While such cultures have both good and bad sides, the African Charter on Human and Peoples’ Rights has identified the bad ones as ‘harmful cultural practices’. This paper argues that, while cultural norms may hinder the realization of human rights, adopting a human rights based approach to address harmful cultural practices is likely the best approach to realizing women’s rights to sexual and reproductive health in Africa.

Keywords: rights, culture, health, women

Procedia PDF Downloads 107
17409 Improving Student Learning in a Math Bridge Course through Computer Algebra Systems

Authors: Alejandro Adorjan

Abstract:

Universities are motivated to understand the factors contributing to low retention of engineering undergraduates. While the number of precollege students interested in engineering increases, the number of engineering graduates continues to decrease, and attrition rates for engineering undergraduates remain high. Calculus 1 (C1) is the entry point of most undergraduate Engineering Science programs and often a prerequisite for Computing Curricula courses. Mathematics continues to be a major hurdle for engineering students, and many students who drop out from engineering specifically cite Calculus as one of the most influential factors in that decision. In this context, creating course activities that increase retention and motivate students to obtain better final results is a challenge. In order to develop several competencies in our students of Software Engineering courses, Calculus 1 at Universidad ORT Uruguay focuses on developing competencies such as capacity for synthesis, abstraction, and problem solving (based on the ACM/AIS/IEEE). Every semester we try to reflect on our practice and try to answer the following research question: What kind of teaching approach in Calculus 1 can we design to retain students and obtain better results? Since 2010, Universidad ORT Uruguay has offered a six-week, non-compulsory summer bridge course of preparatory math (to bridge the math gap between high school and university). Last semester was the first time the Department of Mathematics offered the course while students were enrolled in C1. Traditional lectures in this bridge course led students to just transcribe notes from the blackboard. Last semester we proposed a hands-on lab course using GeoGebra (interactive geometry and Computer Algebra System (CAS) software) as a math-driven development tool. Students worked in a computer laboratory class and developed most of the tasks and topics in GeoGebra. As a result of this approach, several pros and cons were found. The amount of weekly hours of mathematics was excessive for students and, as the course was non-compulsory, attendance decreased with time. Nevertheless, this activity succeeded in improving final test results, and most students expressed the pleasure of working with this methodology. This technology-oriented teaching approach strengthens the math competencies students need for Calculus 1 and improves student performance, engagement, and self-confidence. It is important as teachers to reflect on our practice, including innovative proposals with the objective of engaging students, increasing retention and obtaining better results. The high degree of motivation and engagement of participants with this methodology exceeded our initial expectations, so we plan to experiment with more groups during the summer so as to validate the preliminary results.

Keywords: calculus, engineering education, PreCalculus, Summer Program

Procedia PDF Downloads 275
17408 A Personality-Based Behavioral Analysis on eSports

Authors: Halkiopoulos Constantinos, Gkintoni Evgenia, Koutsopoulou Ioanna, Antonopoulou Hera

Abstract:

E-sports and e-gaming have emerged in recent years as internet use has become universal, and e-gamers are the new reality in our homes. The excessive involvement of young adults with e-sports has already been revealed, and the adverse consequences have been reported in research over the past few years, but the issue has not been fully studied yet. The present research was conducted in Greece and studies the psychological profile of video game players, providing information on the personality traits, habits, and emotional status that affect online gamers’ behaviors, in order to help professionals and policy makers address the problem. Three standardized self-report questionnaires were administered to participants, who were young male and female adults aged 19-26 years old. The Profile of Mood States (POMS) scale was used to evaluate people’s perceptions of their everyday life mood; the personality features that can be traced back to people’s habits and anticipated reactions were measured by the Eysenck Personality Questionnaire (EPQ); and the Trait Emotional Intelligence Questionnaire (TEIQue) was used to measure which cognitive (gamers’ beliefs) and emotional parameters (gamers’ emotional abilities) mainly affected or predicted gamers’ behaviors, leisure time activities, and gaming behaviors. Data mining techniques were used to analyze the data, using machine learning algorithms included in the software package R. The research findings attempt to designate the influence of personality traits, emotional status, and emotional intelligence on, and their correlation with, e-sports and gamers’ behaviors, and to help policy makers and stakeholders take action, shape social policy, and prevent the adverse consequences for young adults. The need for further research, prevention, and treatment strategies is also addressed.

Keywords: e-sports, e-gamers, personality traits, POMS, emotional intelligence, data mining, R

Procedia PDF Downloads 216
17407 Prevention of Road Accidents by Computerized Drowsiness Detection System

Authors: Ujjal Chattaraj, P. C. Dasbebartta, S. Bhuyan

Abstract:

This paper aims to propose a method to detect the action of the driver’s eyes, using the concept of face detection. There are three major contributing methods which can rapidly process the framework of the facial image and hence produce results which can further drive the reactions of the vehicle as pre-programmed for traffic safety. This paper compares and analyses the methods on the basis of their reaction time and their ability to deal with fluctuating images of the driver. The program used in this study is simple and efficient, built using the AdaBoost learning algorithm. Through this program, the system is able to discard background regions and focus on the face-like regions. The results are analyzed on a common computer, which makes it feasible for end users. The application domain of this experiment is quite wide, such as the detection of drowsiness or the influence of alcohol in drivers, or detection for identification purposes.
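
As a rough illustration of an AdaBoost-based detection pipeline of this kind, and not the authors' program, the sketch below uses OpenCV's Haar-cascade detectors (which are trained with AdaBoost) to locate the face and check eye visibility frame by frame. The camera index, cascade files, and frame threshold are assumptions.

```python
# Haar-cascade (AdaBoost-trained) face/eye check as a toy drowsiness monitor.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def eyes_visible(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        roi = gray[y:y + h, x:x + w]                   # search for eyes inside the face region only
        if len(eye_cascade.detectMultiScale(roi, 1.1, 5)) >= 1:
            return True
    return False

cap = cv2.VideoCapture(0)                              # driver-facing camera (assumed index)
closed_frames, ALERT_AFTER = 0, 15                     # ~0.5 s at 30 fps (assumed threshold)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    closed_frames = 0 if eyes_visible(frame) else closed_frames + 1
    if closed_frames >= ALERT_AFTER:
        print("Drowsiness warning")                    # hook for a pre-programmed vehicle reaction
cap.release()
```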

Keywords: AdaBoost learning algorithm, face detection, framework, traffic safety

Procedia PDF Downloads 144
17406 A Time-Varying and Non-Stationary Convolution Spectral Mixture Kernel for Gaussian Process

Authors: Kai Chen, Shuguang Cui, Feng Yin

Abstract:

Gaussian process (GP) regression with a spectral mixture (SM) kernel demonstrates flexible non-parametric Bayesian learning ability in modeling unknown functions. In this work, a novel time-varying and non-stationary convolution spectral mixture (TN-CSM) kernel, with significantly enhanced interpretability obtained by using process convolution, is introduced. A way of decomposing the SM component into an auto-convolution of a base SM component and parameterizing it to be input-dependent is outlined. Performing a convolution between two base SM components then smoothly yields a novel structure of non-stationary SM component with a much better generalized expression and interpretation. The TN-CSM remains fully compatible with the stationary SM kernel in terms of kernel form and spectral basis, aspects ignored or confused by previous non-stationary kernels. On synthetic and real-world datasets, experiments show the time-varying characteristics of the hyper-parameters in TN-CSM and compare the learning performance of TN-CSM with popular and representative non-stationary GP kernels.
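
For reference, the stationary spectral mixture kernel that the TN-CSM builds on has, in one dimension, the closed form k(τ) = Σ_q w_q exp(-2π²τ²v_q) cos(2πτμ_q). The sketch below evaluates that baseline kernel; the hyperparameter values are illustrative, and the paper's time-varying, convolved extension is not reproduced here.

```python
# Stationary spectral mixture (SM) kernel, evaluated on a 1-D input grid.
import numpy as np

def sm_kernel(x1, x2, weights, means, variances):
    """k(x1, x2) for 1-D inputs; weights/means/variances are length-Q sequences."""
    tau = np.subtract.outer(x1, x2)                       # pairwise distances x1_i - x2_j
    k = np.zeros_like(tau, dtype=float)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2 * np.pi**2 * tau**2 * v) * np.cos(2 * np.pi * tau * mu)
    return k

x = np.linspace(0, 5, 50)
K = sm_kernel(x, x, weights=[1.0, 0.5], means=[0.5, 2.0], variances=[0.05, 0.2])
print(K.shape, np.allclose(K, K.T))                       # (50, 50) covariance matrix, symmetric
```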

Keywords: Gaussian process, spectral mixture, non-stationary, convolution

Procedia PDF Downloads 179
17405 Survivability of Maneuvering Aircraft against Air to Air Infrared Missile

Authors: Ji-Yeul Bae, Hyung Mo Bae, Jihyuk Kim, Hyung Hee Cho

Abstract:

An air-to-air infrared missile poses a significant threat to the survivability of an aircraft due to the advanced sensitivity of its sensor and the maneuverability of the missile. Therefore, recent military aircraft are equipped with a MAW (Missile Approach Warning) system to take evasive maneuvers and to deploy countermeasures like chaff and flares. In this research, the effect of MAW sensitivity and the resulting evasive maneuver on the survivability of a fighter aircraft is studied. A single-engine fighter jet flying at Mach 0.9 at an altitude of 5 km is modeled in the research, and the infrared signature of the aircraft is calculated by numerical simulation. The survivability is assessed in terms of lethal range. The MAW sensitivity and the maneuverability of the aircraft are used as variables. The results showed that the improvement in survivability is mainly achieved when the missile approaches from the side of the aircraft, and a maximum 30% increase in the survivability of the aircraft is achieved when the existence of the missile is noticed at a distance of 7 km. In conclusion, the sensitivity of the MAW seems to be a more important factor than the maneuverability of the aircraft in terms of survivability.

Keywords: air to air missile, missile approach warning, lethal range, survivability

Procedia PDF Downloads 549
17404 The Significance of Computer Assisted Language Learning in Teaching English Grammar in Tribal Zone of Chhattisgarh

Authors: Yogesh Kumar Tiwari

Abstract:

Chhattisgarh has realized the fundamental role of information and communication technology in the globalized world, where knowledge is paramount for growth and intellectual development. These technologies are spreading so widely that one feels left behind if not using them. The influence of these radiating technological tools has encompassed all aspects of the educational, business, and economic sectors of our world. Undeniably, the computer has not only established itself globally in all walks of life but has also acquired a fundamental role of paramount importance in the educational process. This role is becoming all-pervading and more powerful as computers are manufactured to be cheaper, smaller in size, adaptable, and easy to handle. Computers are becoming indispensable to teachers because of their enormous capabilities and extensive competence. This study aims at observing the effect of using a computer-based English language software program on the achievement of undergraduate level students studying in a tribal area, Sarguja Division, Chhattisgarh, India. To test the effect of this innovative teaching in the graduate classroom in a tribal area, 50 students were randomly selected and separated into two groups. The first group of 25 students was taught English grammar, i.e., passive voice/narration, through the traditional method using chalk and blackboard and asking some formal questions. The second group, the experimental one, was taught English grammar, i.e., passive voice/narration, using a computer and projector with a PowerPoint presentation of the grammatical items. Statistical analysis was done on the students’ learning capacities and achievement. The result was extremely mesmerizing, not only for the teacher but also for the taught. The recapitulation process demonstrated that the students of the experimental group responded to the questions enthusiastically, with an innovative sense of learning. In light of the findings of the study, it was recommended that teachers and professors of English ought to use self-made instructional programs in their teaching process, particularly in tribal areas.

Keywords: achievement, computer assisted language learning, use of instructional program

Procedia PDF Downloads 134
17403 Understanding the Experiences of School Teachers and Administrators Involved in a Multi-Sectoral Approach to the Creation of a Physical Literacy Enriched Community

Authors: M. Louise Humbert, Karen E. Chad, Natalie E. Houser, Marta E. Erlandson

Abstract:

Physical literacy is the motivation, confidence, physical competence, knowledge, and understanding to value and take responsibility for engagement in physical activities for life. In recent years, physical literacy has emerged as a determinant of health, promoting a positive lifelong physical activity trajectory. Physical literacy’s holistic approach and emphasis on the intrinsic valuation of movement provide an encouraging avenue for intervention among children to develop competent and confident movers. Although there is research on physical literacy interventions, no evidence exists on the outcomes of multi-sectoral interventions involving a combination of home, school, and community contexts. Since children interact with and in a wide range of contexts (home, school, community) daily, interventions designed to address a combination of these contexts are critical to the development of physical literacy. Working with school administrators and teachers, sports and recreation leaders, and community members, our team of university and community researchers conducted and evaluated one of the first multi-contextual and multi-sectoral physical literacy interventions in Canada. Schools played a critical role in this multi-sector intervention, and in this project, teachers and administrators focused their actions on developing physical literacy in students 10 to 14 years of age through the instruction of physical literacy-focused physical education lessons. Little is known about the experiences of educators when they work alongside an array of community representatives to develop physical literacy in school-aged children. Given the uniqueness of this intervention, we sought to answer the question, ‘What were the experiences of school-based educators involved in a multi-sectoral partnership focused on creating a physical literacy enriched community intervention?’ A thematic analysis approach was used to analyze data collected from interviews with educators and administrators, informal conversations, documents, and observations at workshops and meetings. Results indicated that schools and educators played the largest role in this multi-sector intervention. Educators initially reported a limited understanding of physical literacy and expressed a need for resources linked to the physical education curriculum. Some anxiety was expressed by the teachers as their students were measured, and educators noted they wanted to increase their understanding and become more involved in the assessment of physical literacy. Teachers reported that the intervention’s focus on physical literacy positively impacted the scheduling and their instruction of physical education. Administrators shared their desire for school- and division-level actions targeting physical literacy development, like the current focus on numeracy and literacy, treaty education, and safe schools. As this was one of the first multi-contextual and multi-sectoral physical literacy interventions, it was important to document the creation and delivery experiences to encourage future growth in the area and develop suggested best practices.

Keywords: physical literacy, multi sector intervention, physical education, teachers

Procedia PDF Downloads 89
17402 Spatial Distribution, Characteristics, and Pollution Risk Assessment of Microplastics in Sediments from Karnaphuli River Estuary, Bangladesh

Authors: Md. Refat Jahan Rakib, M. Belal Hossain, Rakesh Kumar, Md. Akram Ullah, Sultan Al Nahian, Nazmun Naher Rima, Tasrina Rabia Choudhury, Samia Islam Liba, Jimmy Yu, Mayeen Uddin Khandaker, Abdelmoneim Sulieman, Mohamed Mahmoud Sayed

Abstract:

Microplastics (MPs) have become an emerging global pollutant due to their widespread distribution and potential threats to marine ecosystems. However, studies on the MPs of estuarine and coastal ecosystems of Bangladesh are very limited or not available. Here, we conducted the first study on the abundance, distribution, characteristics, and potential risk assessment of microplastics in the sediment of the Karnaphuli River estuary, Bangladesh. Microplastic particles were extracted from sediments of 30 stations along the estuary by density separation and then enumerated and characterized using a stereomicroscope and Fourier Transform Infrared (FT-IR) spectroscopy. In the collected sediment, the number of MPs varied from 22.29 to 59.5 items kg⁻¹ of dry weight (DW), with an average of 1177 particles kg⁻¹ DW. The mean abundance was higher in the downstream reaches and on the left bank of the estuary, where the predominant shape, colour, and size of MPs were films (35%), white (19%), and >5000 μm (19%), respectively. The main polymer types were polyethylene terephthalate, polystyrene, polyethylene, cellulose, and nylon. MPs were found to pose risks (low to high) in the sediment of the estuary, with the highest risk occurring at one station near a sewage outlet, according to the results of risk analyses using the pollution risk index (PRI), polymer risk index (H), contamination factors (CFs), and pollution load index (PLI). The single value index, PLI, clearly demonstrated that all sampling sites were considerably polluted (as PLI > 1) with microplastics. H values showed that toxic polymers, even in lower proportions, possess higher polymeric hazard scores, and vice versa. This investigation uncovered new insights into the status of MPs in the sediments of the Karnaphuli River estuary, laying the groundwork for future research on the control and management of microplastic pollution.
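
For reference, the contamination factor and pollution load index cited above are simple ratios and a geometric mean: CF_i = C_i / C_baseline and PLI = (CF_1 × CF_2 × ... × CF_n)^(1/n), with PLI > 1 read as polluted. The sketch below shows that arithmetic; the baseline value and station abundances are invented for illustration, not the study's data.

```python
# Contamination factor (CF) and pollution load index (PLI) arithmetic (toy numbers).
import numpy as np

baseline = 200.0                                   # assumed background MP abundance (items/kg DW)
stations = {"S1": 980.0, "S2": 1450.0, "S3": 760.0, "S4": 2100.0}   # illustrative abundances

cf = {s: c / baseline for s, c in stations.items()}            # CF per station
pli = np.prod(list(cf.values())) ** (1.0 / len(cf))            # geometric mean of the CFs

for s, v in cf.items():
    print(f"{s}: CF = {v:.2f}")
print(f"PLI = {pli:.2f} -> {'polluted' if pli > 1 else 'unpolluted'} (PLI > 1 criterion)")
```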

Keywords: microplastics, polymers, pollution risk assessment, Karnaphuli estuary

Procedia PDF Downloads 58
17401 Computing Machinery and Legal Intelligence: Towards a Reflexive Model for Computer Automated Decision Support in Public Administration

Authors: Jacob Livingston Slosser, Naja Holten Moller, Thomas Troels Hildebrandt, Henrik Palmer Olsen

Abstract:

In this paper, we propose a model for human-AI interaction in public administration that involves legal decision-making. Inspired by Alan Turing’s test for machine intelligence, we propose a way of institutionalizing a continuous working relationship between man and machine that aims at ensuring both good legal quality and higher efficiency in decision-making processes in public administration. We also suggest that our model enhances the legitimacy of using AI in public legal decision-making. We suggest that case loads in public administration could be divided between a manual and an automated decision track. The automated decision track will be an algorithmic recommender system trained on former cases. To avoid unwanted feedback loops and biases, part of the case load will be dealt with by both a human case worker and the automated recommender system. In those cases, an experienced human case worker will have the role of an evaluator, choosing between the two decisions. This model will ensure that the algorithmic recommender system is not compromising the quality of legal decision-making in the institution. It also enhances the legitimacy of using algorithmic decision support because it provides justification for its use by being seen as superior to human decisions when the algorithmic recommendations are preferred by experienced case workers. The paper outlines in some detail the process through which such a model could be implemented. It also addresses the important issue that legal decision-making is subject to legislative and judicial changes and that legal interpretation is context sensitive. Both of these issues require continuous supervision of, and adjustments to, algorithmic recommender systems when they are used for legal decision-making purposes.

Keywords: administrative law, algorithmic decision-making, decision support, public law

Procedia PDF Downloads 200
17400 Characterizing Nanoparticles Generated from the Different Working Type and the Stack Flue during 3D Printing Process

Authors: Kai-Jui Kou, Tzu-Ling Shen, Ying-Fang Wang

Abstract:

The objectives of the present study are to characterize the nanoparticles generated from different work activities in a 3D printing room and from the stack flue during the 3D printing process. The studied laboratory (10.5 m × 7.2 m × 3.2 m), with a ventilation rate of 500 m³/h, houses a 3D metal printing machine. A direct-reading scanning mobility particle sizer (SMPS, Model 3082, TSI Inc., St. Paul, MN, USA) was used to conduct static sampling for nanoparticle number concentration and particle size distribution measurements. The SMPS obtained the particle number concentration every 3 minutes; its measured diameter range was 11-372 nm when the aerosol and sheath flow rates were set at 0.6 and 6 L/min, respectively. Measurements of the background, the printing process, the clearing operation, and the screening operation were performed in the laboratory. In addition, we conducted nanoparticle measurements at the 3D printing machine's stack flue to understand its emission characteristics. Results show that the nanoparticles emitted from the different operations had the same uni-modal distribution, with a number median diameter (NMD) of approximately 28.3 nm to 29.6 nm. The number concentrations of nanoparticles were 2.55×10³ count/cm³ for the laboratory background, 2.19×10³ count/cm³ during the printing process, 2.29×10³ count/cm³ during the clearing process, 3.05×10³ count/cm³ during the screening process, 2.69×10³ count/cm³ for the laboratory background after the printing process, and 6.75×10³ count/cm³ outside the laboratory, respectively. We found that there were no nanoparticle emissions into the room during the printing process. However, the number concentration of stack flue nanoparticles during ongoing printing was 1.13×10⁶ count/cm³, and that during non-printing was 1.63×10⁴ count/cm³, with NMDs of 458 nm and 29.4 nm, respectively. It can be confirmed that the measured particles are, in theory, of sizes that easily penetrate the filter during the printing process, even though the 3D printer has a high-efficiency filtration device. Therefore, it is recommended that the stack flue of the 3D printer be equipped with an appropriate dust collection device to prevent operators from being exposed to these hazardous particles.

Keywords: nanoparticle, particle emission, 3D printing, number concentration

Procedia PDF Downloads 162
17399 Efficacy Testing of a Product in Reducing Facial Hyperpigmentation and Photoaging after a 12-Week Use

Authors: Nalini Kaul, Barrie Drewitt, Elsie Kohoot

Abstract:

Hyperpigmentation is the third most common pigmentary disorder for which dermatologic treatment is sought. It affects all ages, resulting in skin darkening because of melanin accumulation. An uneven skin tone caused by exposure to the sun (solar lentigos/age spots/sun spots), by skin disruption following acne or rashes (post-inflammatory hyperpigmentation, PIH), or by hormonal changes (melasma) can lead to significant psychosocial impairment. Dyschromia is the result of various alterations in the biochemical processes regulating melanogenesis. Treatments include the daily use of sunscreen with lightening, brightening, and exfoliating products. Depigmentation is achieved by various depigmenting agents: common examples are hydroquinone, arbutin, azelaic acid, aloesin, mulberry, licorice extracts, kojic acid, niacinamide, ellagic acid, green tea, turmeric, soy, ascorbic acid, and tranexamic acid. These agents affect pigmentation by interfering with mechanisms before, during, and after melanin synthesis. While immediate correction is much sought after, patience and diligence are key. Our objective was to assess the effects of a facial product with pigmentation treatment and UV protection in 35 healthy female subjects (35-65 years) meeting the study criteria. Subjects with mild to moderate hyperpigmentation and fine lines, with no use of skin-lightening products in the last six months or any dermatological procedures in the last twelve months before the study started, were included. Efficacy parameters included expert clinical grading for hyperpigmentation, radiance, skin tone and smoothness, fine lines, and wrinkles; bioinstrumentation (Corneometer®, Colorimeter®); digital photography and imaging (Visia-CR®); and self-assessment questionnaires. Safety included grading for erythema, edema, dryness, and peeling, and self-assessments for itching, stinging, tingling, and burning. Our results showed statistically significant improvement in clinical grading scores, bioinstrumentation, and digital photos for hyperpigmentation (brown spots), fine lines/wrinkles, skin tone, radiance, pores, skin smoothness, and overall appearance compared to baseline. The product was also well tolerated and liked by subjects. Conclusion: Facial hyperpigmentation is of great concern, and treatment strategies are increasingly sought. Clinical trials with both subjective and objective assessments, imaging analyses, and self-perception are essential to distinguish evidence-based products. The multifunctional cosmetic product tested in this clinical study showed efficacy, tolerability, and subject satisfaction in reducing hyperpigmentation and global photoaging.

Keywords: hyperpigmentation, photoaging, clinical testing, expert visual evaluations, bio-instruments

Procedia PDF Downloads 60
17398 An ANOVA Approach for the Process Parameters Optimization of Al-Si Alloy Sand Casting

Authors: Manjinder Bajwa, Mahipal Singh, Manish Nagpal

Abstract:

This research paper aims to propose a novel approach using the ANOVA technique for the strategic investigation of process parameters and their effects on the mechanical properties of an aluminium alloy cast. The two process parameters considered here were the permeability of the sand and the pouring temperature of the aluminium alloy. ANOVA has been employed for the first time to determine the effects of these selected parameters on the impact strength of the alloy. The experimental results show that this proposed technique has great potential for analyzing the sand casting process. Using this approach, we have determined the treatment mean square, response mean square, and mean square of error as 8.54, 8.255, and 0.435, respectively. The research concluded that, at the 5% level of significance, the permeability of sand is the more significant parameter influencing the impact strength of the cast alloy.
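
To make the arithmetic behind those figures concrete: with the reported treatment mean square (8.54) and error mean square (0.435), the F statistic is their ratio, compared against the critical F value at the 5% level. The degrees of freedom below are assumptions for illustration, not values taken from the paper.

```python
# F-test from the reported mean squares (degrees of freedom assumed).
from scipy.stats import f as f_dist

ms_treatment, ms_error = 8.54, 0.435
df_treatment, df_error = 2, 6                 # assumed: 3 treatment levels, 9 observations total

F = ms_treatment / ms_error                   # about 19.6
F_crit = f_dist.ppf(0.95, df_treatment, df_error)
p_value = f_dist.sf(F, df_treatment, df_error)

print(f"F = {F:.2f}, F_crit(0.05) = {F_crit:.2f}, p = {p_value:.4f}")
print("significant at 5%" if F > F_crit else "not significant at 5%")
```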

Keywords: aluminium alloy, pouring temperature, permeability of sand, impact strength, ANOVA

Procedia PDF Downloads 429
17397 Introduction of an Approach of Complex Virtual Devices to Achieve Device Interoperability in Smart Building Systems

Authors: Thomas Meier

Abstract:

One of the major challenges for sustainable smart building systems is to support device interoperability, i.e., connecting sensor or actuator devices from different vendors and presenting their functionality to external applications. Furthermore, smart building systems are supposed to connect with devices that are not available yet, i.e., devices that become available on the market sometime later. It is of vital importance that a sustainable smart building platform provides an appropriate external interface that can be leveraged by external applications and smart services. An external platform interface must be stable and independent of specific devices and should support flexible and scalable usage scenarios. A typical approach applied in smart home systems is based on a generic device interface used within the smart building platform. Device functions, even of rather complex devices, are mapped to that generic base type interface by means of specific device drivers. Our new approach, presented in this work, extends that approach by using the smart building system’s rule engine to create complex virtual devices that can represent the most diverse properties of real devices. We examined and evaluated both approaches by means of a practical case study using a smart building system that we have developed. We show that the solution we present allows the highest degree of flexibility without affecting external application interface stability and scalability. In contrast to other systems, our approach supports complex virtual device configuration at the application layer (e.g., by administration users) instead of device configuration at the platform layer (e.g., by platform operators). Based on our work, we can show that our approach supports almost arbitrarily flexible use case scenarios without affecting the external application interface stability. However, the cost of this approach is additional configuration overhead and additional resource consumption at the IoT platform level that must be considered by platform operators. We conclude that the concept of complex virtual devices presented in this work can be applied to significantly improve the usability and device interoperability of sustainable intelligent building systems.
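
The sketch below is our own minimal reading of the idea, not the paper's implementation: a generic device interface plus a composite "virtual device" that a rule engine could assemble from several real devices, exposing one stable property to applications. Class names and the aggregation rule are assumptions.

```python
# Generic device interface and a rule-composed complex virtual device (illustrative).
from typing import Dict, List

class Device:                                      # generic base-type interface
    def read(self) -> Dict[str, float]:
        raise NotImplementedError

class TemperatureSensor(Device):
    def __init__(self, value: float):
        self.value = value
    def read(self) -> Dict[str, float]:
        return {"temperature": self.value}

class VirtualDevice(Device):
    """Aggregates real devices through a rule; applications see only this interface."""
    def __init__(self, sources: List[Device], rule):
        self.sources, self.rule = sources, rule
    def read(self) -> Dict[str, float]:
        return self.rule([s.read() for s in self.sources])

# Rule: average all temperature readings into one "room temperature" property.
avg_rule = lambda readings: {"room_temperature":
                             sum(r["temperature"] for r in readings) / len(readings)}

room = VirtualDevice([TemperatureSensor(20.5), TemperatureSensor(22.1)], avg_rule)
print(room.read())                                 # {'room_temperature': 21.3}
```

The key design point is that applications bind only to the virtual device's stable interface, while the composition rule can be reconfigured at the application layer without touching the platform or the drivers.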

Keywords: Internet of Things, smart building, device interoperability, device integration, smart home

Procedia PDF Downloads 252
17396 Artificial Neural Networks Based Calibration Approach for Six-Port Receiver

Authors: Nadia Chagtmi, Nejla Rejab, Noureddine Boulejfen

Abstract:

This paper presents a calibration approach based on artificial neural networks (ANN) to determine the envelope signal (I+jQ) of a six-port based receiver (SPR). The memory effects, also called dynamic behavior, and the nonlinearity introduced by the diode-based power detectors have been taken into consideration by the ANN. An experimental set-up was built to validate the efficiency of this method, and the efficiency of the approach was confirmed by the obtained results in terms of waveforms. Moreover, the error vector magnitude (EVM) and the mean absolute error (MAE) were calculated in order to confirm and test the ANN's ability to achieve I/Q recovery using the output voltages of the power detectors. The baseband signal was recovered using the ANN with EVMs no higher than 1% and an MAE no higher than 17.26 for the SPR excited with different types of signals, such as QAM (quadrature amplitude modulation) and LTE (Long Term Evolution) signals.
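
As a rough illustration of the calibration idea, and not the authors' network or set-up, the sketch below trains a small neural network regressor to map four synthetic six-port detector voltages back to the (I, Q) envelope and then computes EVM and MAE on held-out data. The voltage model, network size, and modulation are all assumptions.

```python
# ANN-based I/Q recovery from toy six-port detector voltages, with EVM/MAE metrics.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
iq = rng.choice([-3, -1, 1, 3], size=(n, 2)) / 3.0          # toy 16-QAM symbols

# Toy six-port model: four detector voltages = |s + e^{j*k*pi/2}|^2, plus a mild
# nonlinearity and noise standing in for the real diode detectors and memory effects.
s = iq[:, 0] + 1j * iq[:, 1]
phases = np.exp(1j * np.arange(4) * np.pi / 2)
v = np.abs(s[:, None] + phases[None, :]) ** 2
v = v + 0.05 * v**2 + 0.01 * rng.standard_normal(v.shape)

v_tr, v_te, iq_tr, iq_te = train_test_split(v, iq, test_size=0.2, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(v_tr, iq_tr)
iq_hat = net.predict(v_te)

evm = 100 * np.sqrt(np.mean(np.sum((iq_hat - iq_te) ** 2, axis=1)) /
                    np.mean(np.sum(iq_te ** 2, axis=1)))      # EVM in percent
mae = np.mean(np.abs(iq_hat - iq_te))                         # mean absolute error on I and Q
print(f"EVM = {evm:.2f} %, MAE = {mae:.4f}")
```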

Keywords: six-port based receiver, calibration, nonlinearity, memory effect, artificial neural network

Procedia PDF Downloads 54
17395 The Effect of Climatic and Cultural Conditions in Increasing the Sense of Community in Residential Complexes (Case Study: Saedyeh Residential Complex)

Authors: Razieh Esfandiarisedgh

Abstract:

Community architecture has been proposed as an alternative approach in architecture, with three approaches: political, sociological, and psychological. In community architecture, the psychological approach, as the only approach related to community design, has an important index called the sense of community. Changes in today's modern society, such as the shrinking of families, cause a decrease in the sense of community and in people's willingness to be present in the public spaces of residential complexes. This presence can be increased by using design to motivate people to be present and participate in public spaces, and by taking advantage of the facilities and quality of these spaces. This research used a qualitative research method, studied and collected information, and used observation and interviews in the selected sample. Through targeted sampling and matching with the extracted design table, it was concluded that climate and culture are two important factors in the collective view of housing in Hamedan.

Keywords: community architecture, sense of community, environmental psychology, architecture

Procedia PDF Downloads 41
17394 Enhancing Learning for Research Higher Degree Students

Authors: Jenny Hall, Alison Jaquet

Abstract:

Universities’ push toward the production of high quality research is not limited to academic staff and experienced researchers. In this environment of research-rich agendas, Higher Degree Research (HDR) students are increasingly expected to engage in the publishing of good quality papers in high impact journals. IFN001: Advanced Information Research Skills (AIRS) is a credit-bearing, mandatory coursework requirement for Queensland University of Technology (QUT) doctorates. Since its inception in 1989, this unique blended learning program has provided the foundations for new researchers to produce original and innovative research. AIRS was redeveloped in 2012 and has now been evaluated with reference to the university’s strategic research priorities. Our research is the first comprehensive evaluation of the program from the learner perspective. We measured whether the program develops essential transferable skills and graduate capabilities to ensure best practice in the areas of publishing and data management. In particular, we explored whether AIRS prepares students to be agile researchers with the skills to adapt to different research contexts both within and outside academia. The target group for our study consisted of HDR students and supervisors at QUT. Both quantitative and qualitative research methods were used for data collection. Data were gathered by survey and focus groups, with qualitative responses analyzed using NVivo. The results of the survey show that 82% of students surveyed believe that AIRS assisted their research process and helped them learn skills they need as researchers. The 18% of respondents who expressed reservations about the benefits of AIRS were also examined to determine the key areas of concern. These included trends related to the timing of the program early in the candidature and a belief among some students that their previous research experience was sufficient for postgraduate study. New insights have been gained into how to better support HDR learners in partnership with supervisors and how to enhance the learning experiences of specific cohorts, including international students and mature learners.

Keywords: data management, enhancing learning experience, publishing, research higher degree students, doctoral students

Procedia PDF Downloads 265
17393 The Influence of Audio on Perceived Quality of Segmentation

Authors: Silvio Ricardo Rodrigues Sanches, Bianca Cogo Barbosa, Beatriz Regina Brum, Cléber Gimenez Corrêa

Abstract:

To evaluate the quality of a segmentation algorithm, researchers use subjective or objective metrics. Although subjective metrics are more accurate than objective ones, objective metrics do not require user feedback to test an algorithm. Objective metrics require subjective experiments only during their development. Subjective experiments typically display to users some videos (generated from frames with segmentation errors) that simulate the environment of an application domain. This user feedback is crucial information for metric definition. In the subjective experiments applied to develop some state-of-the-art metrics used to test segmentation algorithms, the videos displayed during the experiments did not contain audio. Audio is an essential component in applications such as videoconferencing and augmented reality. If audio influences the user’s perception, using only videos without audio in subjective experiments can compromise the efficiency of an objective metric generated using data from these experiments. This work aims to identify whether audio influences the user’s perception of segmentation quality in background substitution applications with audio. The proposed approach used a subjective method based on formal video quality assessment methods. The results showed that audio influences the quality of segmentation perceived by a user.

Keywords: background substitution, influence of audio, segmentation evaluation, segmentation quality

Procedia PDF Downloads 103
17392 To Access the Knowledge, Awareness and Factors Associated With Diabetes Mellitus in Buea, Cameroon

Authors: Franck kem Acho

Abstract:

Diabetes mellitus is a chronic metabolic disorder and a fast-growing global problem with huge social, health, and economic consequences. It is estimated that in 2010 there were globally 285 million people (approximately 6.4% of the adult population) suffering from this disease. This number is estimated to increase to 430 million in the absence of better control or a cure. An ageing population and obesity are the two main reasons for the increase. Diabetes mellitus is a chronic heterogeneous metabolic disorder with a complex pathogenesis. It is characterized by elevated blood glucose levels, or hyperglycemia, which results from abnormalities in either insulin secretion or insulin action or both. Hyperglycemia manifests in various forms with a varied presentation and results in carbohydrate, fat, and protein metabolic dysfunctions. Long-term hyperglycemia often leads to various microvascular and macrovascular diabetic complications, which are mainly responsible for diabetes-associated morbidity and mortality. Hyperglycemia also serves as the primary biomarker for the diagnosis of diabetes. Furthermore, it has been shown that almost 50% of putative diabetics are not diagnosed until 10 years after the onset of the disease; hence, the real global prevalence of diabetes must be astronomically high. This study was conducted in one locality to assess the level of knowledge and awareness of, and the risk factors associated with, diabetes mellitus among people living with the disease. A month before the screening was conducted, it was advertised in selected churches, on the local community radio, and in relevant WhatsApp groups. A general health talk was delivered by the head of the screening unit to all attendees, who were educated on the procedure to be carried out, its benefits, and any possible discomforts, after which each attendee's consent was obtained. Participants were evaluated for any indications of diabetes by taking an adequate history and noting signs and symptoms such as excessive thirst, increased urination, tiredness, hunger, unexplained weight loss, feeling irritable or having other mood changes, blurry vision, slow-healing sores, and frequent infections, such as gum, skin, and vaginal infections. Of the 94 participants, 78 were female and 16 were male, and 70.21% of participants with diabetes were between the ages of 60 and 69 years. The study found that only 10.63% of respondents declared a good level of knowledge of diabetes. Of the 3 symptoms of diabetes analyzed in this study, high blood sugar (58.5%) and chronic fatigue (36.17%) were the most recognized. Of the 4 diabetes risk factors analyzed in this study, obesity (21.27%) and an unhealthy diet (60.63%) were the most recognized, while only 10.6% of respondents indicated tobacco use. The diabetic foot was the most recognized diabetes complication (50.57%), but some participants indicated vision problems (30.8%) or cardiovascular diseases (20.21%) as diabetes complications.

Keywords: diabetes mellitus, non-communicable disease, general health talk, hyperglycemia

Procedia PDF Downloads 44
17391 Improved Feature Extraction Technique for Handling Occlusion in Automatic Facial Expression Recognition

Authors: Khadijat T. Bamigbade, Olufade F. W. Onifade

Abstract:

The field of automatic facial expression analysis has been an active research area in the last two decades. Its vast applicability in various domains has drawn much attention to developing techniques and datasets that mirror real life scenarios. Many techniques, such as Local Binary Patterns and their variants (CLBP, LBP-TOP) and, lately, deep learning techniques, have been used for facial expression recognition. However, the problem of occlusion has not been sufficiently handled, making their results not applicable in real life situations. This paper develops a simple yet highly efficient method tagged Local Binary Pattern-Histogram of Gradient (LBP-HOG) with occlusion detection in face images, using a multi-class SVM for Action Unit recognition and, in turn, expression recognition. Our method was evaluated on three publicly available datasets, namely JAFFE, CK, and SFEW. Experimental results showed that our approach performed considerably well when compared with state-of-the-art algorithms and gave insight into occlusion detection as a key step to handling expressions in the wild.
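
As a rough illustration of an LBP-plus-HOG pipeline of the kind named above, and not the authors' exact method, the sketch below concatenates an LBP histogram with HOG features for a face crop and feeds the combined vector to a multi-class linear SVM. The parameter values and the random placeholder data are assumptions.

```python
# LBP + HOG features with a multi-class linear SVM (toy data stands in for JAFFE/CK/SFEW crops).
import numpy as np
from skimage.feature import local_binary_pattern, hog
from sklearn.svm import LinearSVC

def lbp_hog_features(gray_face):                       # gray_face: 2-D array, e.g. 96x96
    lbp = local_binary_pattern(gray_face, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    hog_vec = hog(gray_face, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    return np.concatenate([lbp_hist, hog_vec])          # combined LBP-HOG descriptor

rng = np.random.default_rng(0)
faces = rng.random((70, 96, 96))                        # placeholder face crops
labels = np.repeat(np.arange(7), 10)                    # seven basic expression classes

X = np.array([lbp_hog_features(f) for f in faces])
clf = LinearSVC(max_iter=5000).fit(X, labels)
print(clf.predict(X[:3]))
```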

Keywords: automatic facial expression analysis, local binary pattern, LBP-HOG, occlusion detection

Procedia PDF Downloads 156
17390 Examples of Parameterization of Stabilizing Controllers with One-Side Coprime Factorization

Authors: Kazuyoshi Mori

Abstract:

Examples of the parameterization of stabilizing controllers that require only one of the right-/left-coprime factorizations are presented. One parameterization method requires a one-sided coprime factorization; the other requires no coprime factorization. The methods are based on the factorization approach, so the method used in this paper can be applied to a number of models.

Keywords: parametrization, coprime factorization, factorization approach, linear systems

Procedia PDF Downloads 359
17389 Developing of Attitude towards Using Complementary Treatments Scale in Turkey

Authors: Ayşegül Bilge, Merve Uğuryol, Şeyda Dülgerler, Mustafa Yıldız

Abstract:

The purpose of this research is to establish the reliability and validity of the Attitude towards Using Complementary Treatments Scale. The research is a methodological study planned to determine the validity and reliability of the scale, which has been developed by the researchers. The scale contains 23 questions covering the complementary and modern therapies individuals apply when they have health problems, and a 4-point Likert-type rating has been used in preparing the questionnaire. A high score obtained from the scale indicates a positive attitude towards complementary therapies. In the course of the validity assessment of the scale, expert opinion was obtained, and the content validity of the scale was determined using Kendall's coefficient of concordance (W = 0.200, p = 0.460). In the course of the reliability assessment of the scale, the item-total score correlations of the 23 items were examined, and items under the 0.20 correlation limit were removed from the scale. As a result, the scale was reduced to 13 items. In the internal consistency analyses, Cronbach's alpha value was found to be 0.79. As a result of the validity analyses of the Attitude towards Using Complementary Treatments Scale, the content and language validity were found to be at the expected level. It was determined to be a highly reliable scale as a result of the reliability analyses. In conclusion, the Attitude towards Using Complementary Treatments Scale is a valid and reliable scale.
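
For reference, the Cronbach's alpha reported above follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below shows that computation; the response matrix is random placeholder data (so the printed value will be near zero), not the study's responses.

```python
# Cronbach's alpha for a subjects-by-items response matrix.
import numpy as np

def cronbach_alpha(responses):
    """responses: (n_subjects, n_items) array of Likert scores."""
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)          # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)      # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
data = rng.integers(1, 4, size=(100, 13), endpoint=True)   # 100 respondents, 13 items, 1-4 ratings
print(round(cronbach_alpha(data.astype(float)), 2))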

Keywords: alternative health care, complementary treatment, instrument development, nursing practice

Procedia PDF Downloads 384
17388 Evaluation of Earthquake Induced Cost for Mid-Rise Buildings

Authors: Gulsah Olgun, Ozgur Bozdag, Yildirim Ertutar

Abstract:

This paper mainly focuses on the performance assessment of buildings by associating the damage level with the damage cost. For this purpose, a methodology is explained and applied to a representative mid-rise concrete building located in Izmir. In order to consider uncertainties in the occurrence of earthquakes, the structural analyses are conducted for all possible earthquakes in the region through the hazard curve. By means of the analyses, the probabilities of the structural response being in different limit states are obtained and used to calculate the expected damage cost. The expected damage cost comprises diverse cost components related to earthquakes, such as the cost of casualties, the replacement or repair cost of the building, etc. In this study, inter-story drift is used as an effective response variable to associate the expected damage cost with different damage levels. The structural analysis methods performed to obtain the inter-story drifts are the response spectrum method as a linear method, and the more accurate push-over and time history methods to demonstrate the nonlinear effects on loss estimation. Comparison of the results indicates that each method provides similar values of expected damage cost. To sum up, this paper explains an approach which makes it possible to minimize the expected damage cost of buildings and relates the performance level to the damage cost.
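
The expected-damage-cost arithmetic described above amounts to multiplying the probability of the response falling in each limit state by that state's cost and summing. The numbers in the sketch below are illustrative assumptions, not the Izmir case-study values; the remaining probability corresponds to a no-damage state with zero cost.

```python
# Expected damage cost = sum over limit states of P(state) * cost(state) (toy numbers).
damage_states = ["slight", "moderate", "extensive", "complete"]
p_state = {"slight": 0.45, "moderate": 0.30, "extensive": 0.15, "complete": 0.05}   # from analyses (assumed)
cost_state = {"slight": 20_000, "moderate": 120_000, "extensive": 450_000, "complete": 1_200_000}

expected_cost = sum(p_state[s] * cost_state[s] for s in damage_states)
print(f"Expected damage cost: {expected_cost:,.0f}")     # 9,000 + 36,000 + 67,500 + 60,000 = 172,500
```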

Keywords: expected damage cost, limit states, loss estimation, performance based design

Procedia PDF Downloads 254