Search results for: academic learning stress
627 Establishing Correlation between Urban Heat Island and Urban Greenery Distribution by Means of Remote Sensing and Statistics Data to Prioritize Revegetation in Yerevan
Authors: Linara Salikhova, Elmira Nizamova, Aleksandra Katasonova, Gleb Vitkov, Olga Sarapulova.
Abstract:
While most European cities conduct research on heat-related risks, there is a research gap in the Caucasus region, particularly in Yerevan, Armenia. This study aims to test the method of establishing a correlation between urban heat islands (UHI) and urban greenery distribution for prioritization of heat-vulnerable areas for revegetation. Armenia has failed to consider measures to mitigate UHI in urban development strategies despite a 2.1°C increase in average annual temperature over the past 32 years. However, planting vegetation in the city is commonly used to deal with air pollution and can be effective in reducing UHI if it prioritizes heat-vulnerable areas. The research focuses on establishing such priorities while considering the distribution of urban greenery across the city. The lack of spatially explicit air temperature data necessitated the use of satellite images to achieve the following objectives: (1) identification of land surface temperatures (LST) and quantification of temperature variations across districts; (2) classification of massifs of land surface types using normalized difference vegetation index (NDVI); (3) correlation of land surface classes with LST. Examination of the heat-vulnerable city areas (in this study, the proportion of individuals aged 75 years and above) is based on demographic data (Census 2011). Based on satellite images (Sentinel-2) captured on June 5, 2021, NDVI calculations were conducted. The massifs of the land surface were divided into five surface classes. Due to capacity limitations, the average LST for each district was identified using one satellite image from Landsat-8 on August 15, 2021. In this research, local relief is not considered, as the study mainly focuses on the interconnection between temperatures and green massifs. The average temperature in the city is 3.8°C higher than in the surrounding non-urban areas. The temperature excess ranges from a low in Norq Marash to a high in Nubarashen. 
Norq Marash and Avan have the highest tree and grass coverage proportions, with 56.2% and 54.5%, respectively. In the other districts, the combined share of wastelands and buildings is up to three times higher than that of grass and trees, ranging from 49.8% in Quanaqer-Zeytun to 76.6% in Nubarashen. Studies have shown that decreased tree and grass coverage within a district correlates with a higher temperature increase. The temperature excess is highest in the Erebuni, Ajapnyak, and Nubarashen districts. These districts have less than 25% of their area covered with grass and trees. On the other hand, the Avan and Norq Marash districts have a lower temperature difference, as more than 50% of their areas are covered with trees and grass. According to the findings, a significant proportion of the elderly population (35%) aged 75 years and above resides in the Erebuni, Ajapnyak, and Shengavit districts, which are more susceptible to heat stress, with an LST higher than in other city districts. The findings suggest that the method of comparing the distribution of green massifs and LST can contribute to the prioritization of heat-vulnerable city areas for revegetation. The method can become a rationale for the formation of an urban greening program.
Keywords: heat-vulnerability, land surface temperature, urban greenery, urban heat island, vegetation
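The NDVI step described in this abstract has a standard form: NDVI = (NIR - Red) / (NIR + Red), computed per pixel from the red and near-infrared bands (B4 and B8 on Sentinel-2). A minimal sketch follows; the reflectance values and the five class thresholds are illustrative assumptions, not the study's own:

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon guards against 0/0

def classify(ndvi_values, thresholds=(-0.1, 0.1, 0.3, 0.5)):
    """Bin NDVI into five illustrative surface classes:
    0 water, 1 built-up/wasteland, 2 sparse grass, 3 grass, 4 trees/dense vegetation."""
    return np.digitize(np.asarray(ndvi_values), thresholds)
```

Dense vegetation reflects strongly in the near-infrared and weakly in the red, so its NDVI approaches 1; bare soil and built-up surfaces sit near 0, and water is negative.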
Procedia PDF Downloads 726
626 Protected Cultivation of Horticultural Crops: Increases Productivity per Unit of Area and Time
Authors: Deepak Loura
Abstract:
The most contemporary method of producing horticultural crops both qualitatively and quantitatively is protected cultivation, or greenhouse cultivation, which has gained widespread acceptance in recent decades. Protected farming, commonly referred to as controlled environment agriculture (CEA), is extremely productive and land- and water-efficient, as well as environmentally friendly. The technology entails growing horticultural crops in a controlled environment where variables such as temperature, humidity, light, soil, water, fertilizer, etc. are adjusted to achieve optimal output and enable a consistent supply of these crops even during the off-season. Over the past ten years, protected cultivation of high-value crops and cut flowers has demonstrated remarkable potential. More and more agricultural and horticultural crop production systems are moving to protected environments as a result of the growing demand for high-quality products in global markets. By covering the crop, it is possible to control the macro- and microenvironments, enhancing plant performance and allowing for longer production times, earlier harvests, and higher yields of better quality. These shielding structures alter the plant's environment while also offering protection from wind, rain, and insects. Protected farming opens up hitherto unexplored opportunities in agriculture as the liberalised economy and improved agricultural technologies advance. Typically, the revenues from fruit, vegetable, and flower crops are 4 to 8 times higher than those from other crops. If any of these high-value crops are cultivated in protected environments such as greenhouses, net houses, tunnels, etc., this profit can be multiplied. Vegetable and cut flower post-harvest losses are extremely high (20–0%); however, sheltered growing techniques and year-round cropping can greatly minimize post-harvest losses and enhance yield by 5–10 times. Seasonality and weather have a big impact on the production of vegetables and flowers.
This variability results in significant price and quality fluctuations for vegetables. For the application of current technology in crop production, achieving a balance between year-round availability of vegetables and flowers, minimal environmental impact, and remaining competitive is a significant challenge. As population growth reduces the amount of land that can be held, protected cultivation is set to play a growing role in the future of agriculture. Protected agriculture is a particularly profitable endeavor for small landholdings. Small greenhouses, net houses, nurseries, and low tunnel greenhouses can all be built by farmers to increase their income. The rise in biotic and abiotic stress factors also favours protected agriculture. As a result of the greater productivity levels, these technologies are opening up opportunities not only for producers with larger landholdings but also for those with smaller holdings. Protected cultivation can be thought of as a kind of precise, forward-thinking, parallel agriculture that covers almost all aspects of farming, subject to further assessment of technical applicability to local circumstances, farmer economics, and market economics.
Keywords: protected cultivation, horticulture, greenhouse, vegetable, controlled environment agriculture
Procedia PDF Downloads 76
625 Rethinking Entrepreneurship Education as a Remedy for Graduates Unemployment in Nigeria
Authors: Chinwe Susan Oguejiofor, Daniel Osamwonyi Iyioha
Abstract:
Over the last two decades, Nigeria has witnessed an upsurge in graduate unemployment, occasioned by the lack of industries and the proliferation of tertiary institutions churning out thousands of graduates every year to compete for the few available jobs. The astronomical rise in the unemployment rate amongst Nigerian graduates, however, is principally attributed to the defective curricula of the universities and other tertiary institutions, whose focus is on training for white-collar jobs. Although graduate unemployment has become a global scourge, its adverse economic impact is believed to be greater in developing economies like Nigeria, with a huge young population within the working age who cannot seem to find gainful employment to make a respectable livelihood. Thus, higher institutions, especially universities, find themselves under pressure and intense competition to produce graduates who can think outside the box and create jobs; hence the need to focus on instilling hands-on practical job skills into their students that will make them job creators rather than job seekers on graduation. In the same vein, stakeholders in education have continued to lend their voices to the philosophy that undergraduate curricula should be completely overhauled to accommodate the development of hands-on practical skills and the innovative capacity to create solutions to societal problems. In a bid to correct this anomaly, the Federal Government of Nigeria, in conjunction with the Ministry of Commerce, Industry and Investment, inaugurated a programme tagged “University Entrepreneurship Development Programme” (UNEDEP), whose objective was basically to promote self-employment among the youth right from the institutions of higher learning. The question, however, is whether the objectives of the programme have actually been achieved. Despite its inclusion in the Nigerian educational curriculum for close to two decades now, one wonders whether its purpose has been defeated.
Thus, the paper focuses on the concept of entrepreneurship education, the objectives of entrepreneurship education, graduate unemployment, rethinking entrepreneurship education programmes in tertiary institutions for employment generation, the role of entrepreneurship in job creation, and the challenges of entrepreneurship education in tertiary institutions in Nigeria; conclusions and recommendations are drawn accordingly.
Keywords: rethinking, entrepreneurship education, remedy, unemployment, job creation
Procedia PDF Downloads 79
624 Improving Cell Type Identification of Single Cell Data by Iterative Graph-Based Noise Filtering
Authors: Annika Stechemesser, Rachel Pounds, Emma Lucas, Chris Dawson, Julia Lipecki, Pavle Vrljicak, Jan Brosens, Sean Kehoe, Jason Yap, Lawrence Young, Sascha Ott
Abstract:
Advances in technology make it now possible to retrieve the genetic information of thousands of single cancerous cells. One of the key challenges in single cell analysis of cancerous tissue is to determine the number of different cell types and their characteristic genes within the sample to better understand the tumors and their reaction to different treatments. For this analysis to be possible, it is crucial to filter out background noise, as it can severely blur the downstream analysis and give misleading results. In-depth analysis of the state-of-the-art filtering methods for single cell data showed that they do, in some cases, not separate noisy and normal cells sufficiently. We introduce an algorithm that filters and clusters single cell data simultaneously without relying on certain genes or thresholds chosen by eye. It detects communities in a Shared Nearest Neighbor similarity network, which captures the similarities and dissimilarities of the cells, by optimizing the modularity, and then identifies and removes vertices with weak cluster membership. This strategy is based on the fact that noisy data instances are very likely to be similar to true cell types but do not match any of them well. Once the clustering is complete, we apply a set of evaluation metrics on the cluster level and accept or reject clusters based on the outcome. The performance of our algorithm was tested on three datasets and led to convincing results. We were able to replicate the results on a Peripheral Blood Mononuclear Cells dataset. Furthermore, we applied the algorithm to two samples of ovarian cancer from the same patient before and after chemotherapy. Comparing the standard approach to our algorithm, we found a hidden cell type in the ovarian post-chemotherapy data with interesting marker genes that are potentially relevant for medical research.
Keywords: cancer research, graph theory, machine learning, single cell analysis
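The Shared Nearest Neighbor similarity network at the core of this approach weights each pair of cells by how many nearest neighbours they have in common; noisy cells tend to share few neighbours with any cluster. A minimal sketch of the SNN construction only (brute-force Euclidean distances on hypothetical toy data; the paper's actual pipeline, modularity optimization, and parameters are not shown here):

```python
import numpy as np

def knn_indices(X, k):
    """Indices of each row's k nearest neighbours (Euclidean, excluding itself)."""
    X = np.asarray(X, dtype=float)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # a cell is not its own neighbour
    return np.argsort(d, axis=1)[:, :k]

def snn_similarity(X, k=3):
    """Shared-nearest-neighbour network: edge weight = |kNN(i) ∩ kNN(j)|."""
    nn = knn_indices(X, k)
    n = len(nn)
    W = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            W[i, j] = W[j, i] = len(set(nn[i]) & set(nn[j]))
    return W
```

Community detection (e.g. modularity-based clustering) then runs on the weighted graph W, and vertices whose edges into their assigned community are weak are treated as noise.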
Procedia PDF Downloads 112
623 Developing Medical Leaders: A Realistic Evaluation Study for Improving Patient Safety and Maximising Medical Engagement
Authors: Lisa Fox, Jill Aylott
Abstract:
There is a global need to identify ways to engage doctors in non-clinical matters such as medical leadership, service improvement and health system transformation. Using the core principles of Realistic Evaluation (RE), this study examined what works, for doctors of different grades, specialities and experience in an acute NHS Hospital Trust in the UK. Realistic Evaluation is an alternative to more traditional cause and effect evaluation models and seeks to understand the interdependencies of Context, Mechanism and Outcome proposing that Context (C) + Mechanism (M) = Outcome (O). In this study, the context, mechanism and outcome were examined from within individual medical leaders to determine what enables levels of medical engagement in a specific improvement project to reduce hospital inpatient mortality. Five qualitative case studies were undertaken with consultants who had regularly completed mortality reviews over a six month period. The case studies involved semi-structured interviews to test the theory behind the drivers for medical engagement. The interviews were analysed using a theory-driven thematic analysis to identify CMO configurations to explain what works, for whom and in what circumstances. The findings showed that consultants with a longer length of service became more engaged if there were opportunities to be involved in the beginning of an improvement project, with more opportunities to affect the design. Those that are new to a consultant role were more engaged if they felt able to apply any learning directly into their own settings or if they could use it as an opportunity to understand more about the organisation they are working in. This study concludes that RE is a useful methodology for better understanding the complexities of motivation and consultant engagement in a trust wide service improvement project. 
The study showed that there should be differentiated and bespoke training programmes to maximise each individual doctor’s propensity for medical engagement. The RE identified that there are different ways to ensure that doctors have the right skills to feel confident in service improvement projects.
Keywords: realistic evaluation, medical leadership, medical engagement, patient safety, service improvement
Procedia PDF Downloads 216
622 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry
Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood
Abstract:
The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment, for a simple geometry. This type of fluid behaviour takes place in many practical engineering applications, hence the reason for it being investigated. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques, such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry or hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to such flows is investigated, at various Reynolds numbers corresponding to different flow regimes. Reports of this measuring technique being used in separated flows are very hard to find in the literature. Moreover, most of the situations where the Reynolds number effect is evaluated in separated flows are in numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. The ADV Nortek Vectrino+ was used to characterize the flow, in a recirculating laboratory flume, at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared to those obtained using other measuring techniques. To enable comparison with other researchers, the step height, expansion ratio and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density for the stream-wise horizontal velocity component.
The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness. Moreover, the errors obtained in the uncertainty analysis were, in general, relatively low. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and good agreement was found. The ADV technique proved able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flows. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels and thus decrease the uncertainty.
Keywords: ADV, experimental data, multiple Reynolds number, post-processing
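The step-height Reynolds numbers quoted in this abstract follow the standard definition Reh = U·h/ν. A small sketch, assuming water as the working fluid (ν of roughly 1.0e-6 m²/s); the velocity and step height below are illustrative, not the experiment's actual values:

```python
def reynolds_number(velocity, step_height, kinematic_viscosity=1.0e-6):
    """Step-height Reynolds number: Re_h = U * h / nu.
    Default nu assumes water at roughly 20 degrees C (1.0e-6 m^2/s)."""
    return velocity * step_height / kinematic_viscosity

# Illustrative only: a 0.02 m step at 0.187 m/s gives Re_h of about 3740,
# the order of the lowest regime reported (Re_h = 3738).
```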
Procedia PDF Downloads 147
621 Contextual SenSe Model: Word Sense Disambiguation using Sense and Sense Value of Context Surrounding the Target
Authors: Vishal Raj, Noorhan Abbas
Abstract:
Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguities, such as lexical, syntactic, semantic, anaphoric and referential ambiguities. This study is focused mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used lemma and Part of Speech (POS) tokens of words for training and testing. Lemma adds generality, and POS adds the word's properties to the token. We have designed a novel method to create an affinity matrix to calculate the affinity between any pair of lemma_POS tokens (a token where the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create the sense clusters of tokens using the affinity matrix under a hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of a target word using the affinity/similarity value are devised. Each contextual token contributes to the sense of the target word with some value, and whichever sense gets the highest value becomes the sense of the target word. Thus, contextual tokens play a key role in creating sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to explain, in contrast to contemporary deep learning models, which are intricate, time-intensive and hard to interpret. CSM is trained on SemCor training data and evaluated on the SemEval test dataset.
The results indicate that despite the naivety of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.
Keywords: word sense disambiguation (WSD), contextual sense model (CSM), most frequent sense (MFS), part of speech (POS), natural language processing (NLP), OOV (out of vocabulary), lemma_POS (a token where lemma and POS of word are joined by underscore), information retrieval (IR), machine translation (MT)
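The affinity matrix at the heart of CSM measures how strongly lemma_POS tokens are associated in the training data. As a loose illustration only, using a plain sentence-level co-occurrence count (not the authors' actual affinity formula) on a hypothetical three-sentence corpus:

```python
from collections import defaultdict
from itertools import combinations

def build_affinity(sentences):
    """Symmetric co-occurrence counts between lemma_POS tokens (per sentence)."""
    aff = defaultdict(int)
    for sent in sentences:
        for a, b in combinations(sorted(set(sent)), 2):
            aff[(a, b)] += 1
    return aff

def affinity(aff, a, b):
    """Affinity of an unordered token pair; a token has no affinity with itself."""
    return 0 if a == b else aff[tuple(sorted((a, b)))]

# Hypothetical mini-corpus of lemma_POS tokens:
train = [
    ["bank_NOUN", "river_NOUN", "flow_VERB"],
    ["bank_NOUN", "money_NOUN", "deposit_VERB"],
    ["river_NOUN", "water_NOUN", "flow_VERB"],
]
aff = build_affinity(train)
```

Context tokens then vote for a sense: a context word like "river_NOUN" pulls an ambiguous "bank_NOUN" towards its riverside sense because their affinity is non-zero, mirroring the paper's idea that each contextual token contributes a value and the highest-scoring sense wins.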
Procedia PDF Downloads 107
620 An Exploratory Study of Preschool English Education in China
Authors: Xuan Li
Abstract:
The English language occupies a crucial position in the Chinese educational system and is officially introduced in the school curriculum from the third year of primary school onward. However, it is worth noting that along with the movement to remove primary-oriented education from preschools, the teaching of English is banned in preschools. Considering the worldwide trend of learning English at a young age, whether this ban can be implemented successfully is doubtful. With an initial focus on the interaction of language-in-education planning and policy (LEPP) at the macro level and actual practice at the micro level, this research selected three private preschools and two public preschools to explore what is taking place in terms of English education. All data collected is qualitative and is gained from documentary analysis, school observation, interviews, and focus groups. The findings show that: (1) although the English ban in preschool education aims to regulate all types of preschools and all adult Chinese participants are aware of this ban, there are very different scenarios according to type of preschool, such that no English classes are found in public schools while private preschools commonly provide some kind of English education; (2) even public schools do not have an English-free environment and parents’ demand for English education is high; (3) there is an obvious top-down hierarchy in both public and private schools, in which administrators make the decisions while others have little power to influence the school curriculum; (4) there is a clear gap in the perception of English teaching between children and adults, in which adults prefer foreign English teachers and think English teaching is just playing, while children do not have a clear preference regarding teachers and do not think English class is just for fun; (5) without macro support, there are many challenges involved in preschool English education, including the shortage of qualified teachers 
and teaching resources, ineffective personnel management, and few opportunities for speaking English in daily life. Hopefully, this research will not only highlight the interaction of LEPP at different levels and the importance of individual agency but also raise awareness of how to provide qualified and equal education for all children.
Keywords: individual agency, language-in-education planning and policy, micro context, preschool English education
Procedia PDF Downloads 151
619 Organisational Mindfulness Case Study: A 6-Week Corporate Mindfulness Programme Significantly Enhances Organisational Well-Being
Authors: Dana Zelicha
Abstract:
A 6-week mindfulness programme was launched to improve the well-being and performance of 20 managers (including the supervisor) of an international corporation in London. A unique assessment methodology was customised to the organisation’s needs, measuring four parameters: prioritising skills, listening skills, mindfulness levels and happiness levels. All parameters showed significant improvements (p < 0.01) post-intervention, with a remarkable increase in listening skills and mindfulness levels. Although corporate mindfulness programmes have proven to be effective, the challenge remains the low engagement levels at home and the implementation of these tools beyond the scope of the intervention. This study offers an innovative approach to reinforce home engagement levels, which yielded promising results. The programme launched with a 2-day introduction intervention, which was followed by a 6-week training course (1 day a week; 2 hours each). Participants learned all the basic principles of mindfulness, such as mindfulness meditations, Mindfulness-Based Stress Reduction (MBSR) techniques and Mindfulness-Based Cognitive Therapy (MBCT) practices, to incorporate into their professional and personal lives. The programme contained experiential mindfulness meditations and innovative mindfulness tools (OWBA-MT) created by OWBA - The Well Being Agency. Exercises included Mindful Meetings, Unitasking and Mindful Feedback. All sessions concluded with guided discussions and group reflections. One fundamental element of this programme was the engagement level outside of the workshop. In the office, participants connected with a mindfulness buddy - a team member in the group with whom they could find support throughout the programme. At home, participants completed online daily mindfulness forms that varied according to weekly themes.
These customised forms gave participants the opportunity to reflect on whether they had made time for daily mindfulness practice, and helped facilitate a sense of continuity and responsibility. At the end of the programme, the most engaged team member was crowned the ‘mindful maven’ and received a special gift. The four parameters were measured using online self-reported questionnaires, including the Listening Skills Inventory (LSI), the Mindful Attention Awareness Scale (MAAS), the Time Management Behaviour Scale (TMBS) and a modified version of the Oxford Happiness Questionnaire (OHQ). Pre-intervention questionnaires were collected at the start of the programme, and post-intervention data were collected 4 weeks following completion. Quantitative analysis using paired t-tests of means showed significant improvements, with a 23% increase in listening skills, a 22% improvement in mindfulness levels, a 12% increase in prioritising skills, and an 11% improvement in happiness levels. Participant testimonials exhibited high levels of satisfaction, and the overall results indicate that the mindfulness programme substantially impacted the team. These results suggest that 6-week mindfulness programmes can improve employees’ capacities to listen and work well with others, to effectively manage time and to experience enhanced satisfaction both at work and in life. Limitations worth considering include the afterglow effect and lack of generalisability, as this study was conducted on a small and fairly homogenous sample.
Keywords: corporate mindfulness, listening skills, organisational well-being, prioritising skills, mindful leadership
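The paired t-tests of means used in this study compare each participant's pre- and post-intervention scores: t = mean(d) / (sd(d)/sqrt(n)) over the per-participant differences d. A minimal sketch with hypothetical scores (not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-participant differences post - pre."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance, n-1 dof
    return mean / math.sqrt(var / n)

# Hypothetical pre/post questionnaire scores for four participants:
t = paired_t([10, 12, 14, 16], [12, 16, 20, 24])  # t ~ 3.87 on 3 dof
```

Because every participant is their own control, the test works on the differences rather than on two independent groups, which suits the within-subjects pre/post design described above.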
Procedia PDF Downloads 270
618 The Effectiveness of Guest Lecturers with Disabilities in the Classroom
Authors: Afshin Gharib
Abstract:
Often, instructors prefer to bring into class a guest lecturer who can provide an “experiential” perspective on a particular topic. The assumption is that the personal experience brought into the classroom makes the material resonate more with students and that students prefer material taught from an experiential perspective. The question we asked in the present study was whether a guest lecture from an “experiential” expert with a disability (e.g., a guest suffering from cone-rod dystrophy lecturing on vision, or a dyslexic lecturing on the psychology of reading) would be more effective than the course instructor in capturing students’ attention and conveying information in an Introduction to Psychology class. Students in two sections of Introduction to Psychology (N = 25 in each section) listened to guest lecturers with disabilities lecturing on a topic related to their disability, one in the area of Sensation and Perception (the guest lecturer is vision impaired) and one in the area of Language Development (the guest lecturer is dyslexic). The guest lecturers lectured on the same topic in both sections; however, each lecturer used their own experiences to highlight the topics covered in one section but not the other (counterbalanced between sections), providing students in one section with experiential testimony. Following each of the four lectures (two experiential, two non-experiential), students rated the lecture on several dimensions, including overall quality, level of engagement, and performance. In addition, students in both sections were tested on the same test items from the lecture material to ascertain the degree of learning, and given identical “pop” quizzes two weeks after the exam to measure retention. It was hypothesized that students would find the experiential lectures from lecturers talking about their disabilities more engaging, learn more from them, and retain the material longer.
We found that students in fact preferred the course instructor to the guests, regardless of whether the guests included a discussion of their own disability in their lectures. Performance on the exam questions and the pop quiz items did not differ between “experiential” and “non-experiential” lectures, suggesting that guest lecturers who discuss their own disabilities in lecture are not more effective in conveying material, and students are not more likely to retain material delivered by “experiential” guests. In future research, we hope to explore the reasons for students’ preference for their regular instructor over guest lecturers.
Keywords: guest lecturer, student perception, retention, experiential
Procedia PDF Downloads 17
617 Hand Gesture Detection via EmguCV Canny Pruning
Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae
Abstract:
Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of Artificial Intelligence (AI). AI concepts are applicable in Human-Computer Interaction (HCI), Expert Systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool, used mostly by deaf communities and people with speech disorders. Communication barriers exist when people with speech disorders interact with others. This research aims to build a hand recognition system for Lesotho’s Sesotho and English language interpretation. The system will help to bridge the communication problems encountered by the mentioned communities. The system has various processing modules, consisting of a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is a process of identifying an object. The proposed system uses Haar cascade detection with Canny pruning, which applies Canny edge detection, an optimal image processing algorithm used to detect the edges of an object. The system also employs a skin detection algorithm, which performs background subtraction and computes the convex hull and the centroid to assist in the detection process. Recognition is a process of gesture classification. Template matching classifies each hand gesture in real time. The system was tested in various experiments. The results obtained show that time, distance, and light are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were considered: the higher the light intensity, the faster the detection rate.
Based on the results obtained from this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system which can be used for sign language interpretation.
Keywords: canny pruning, hand recognition, machine learning, skin tracking
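The skin detection step described above computes a binary skin mask and its centroid to localise the hand. A rough sketch of just that step; the RGB thresholds and the toy image are illustrative assumptions, not the system's actual values:

```python
import numpy as np

def skin_mask(rgb):
    """Crude skin-colour rule on an RGB image; thresholds are illustrative only."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def centroid(mask):
    """Centroid (row, col) of the True pixels in a binary mask."""
    ys, xs = np.nonzero(mask)
    return float(ys.mean()), float(xs.mean())

# Toy 4x4 image with a skin-coloured 2x2 patch in the middle:
demo = np.zeros((4, 4, 3), dtype=np.uint8)
demo[1:3, 1:3] = (200, 120, 80)
```

In a full pipeline, the convex hull of the mask would then be computed and the centroid used to track the hand region across frames before template matching.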
Procedia PDF Downloads 185
616 Oil and Proteins of Sardine (Sardina Pilchardus) Compared with Casein or Mixture of Vegetable Oils Improves Dyslipidemia and Reduces Inflammation and Oxidative Stress in Hypercholesterolemic and Obese Rats
Authors: Khelladi Hadj Mostefa, Krouf Djamil, Taleb-Dida Nawel
Abstract:
Background: Obesity results from a prolonged imbalance between energy intake and energy expenditure, which depends in part on basal metabolic rate. Marine oils and proteins have important therapeutic effects (for example, against obesity and hypercholesterolemia) as well as antioxidant effects. The sardine is a widely consumed fish in the Mediterranean region; its consumption provides various nutrients such as oils (rich in omega-3 polyunsaturated fatty acids) and proteins. Methods: Sardine oil (SO) and sardine proteins (SP) were extracted and purified. A mixture of vegetable oils (olive-walnut-sunflower) was prepared from oils produced in Algeria. Eighteen Wistar rats were fed a high-fat diet enriched with 1% cholesterol for 30 days to induce obesity and hypercholesterolemia. The rats were then divided into 3 groups. The first group consumed 20% sardine protein combined with 5% sardine oil (38% saturated fatty acids (SFA), 31% monounsaturated fatty acids (MUFA), and 31% polyunsaturated fatty acids (PUFA)) (SPso). The second group consumed 20% sardine protein combined with 5% of a mixture of vegetable oils (VO) containing 13% SFA, 58% MUFA, and 29% PUFA (SPvo), and the third group consumed 20% casein combined with 5% of the vegetable oil mixture, serving as a semi-synthetic reference (CASvo). Body weight and glycaemia were measured weekly. After 28 days of experimentation, the rats were sacrificed, and the blood and liver were removed. Serum assays of total cholesterol (TC) and triglycerides (TG) were performed by enzymatic colorimetric methods. Lipid peroxidation was evaluated by assaying thiobarbituric acid reactive species (TBARS) and hydroperoxide values. Protein oxidation was assessed by assaying carbonyl derivative values.
Finally, antioxidant defense was evaluated by measuring the activity of the antioxidant enzymes superoxide dismutase (SOD) and catalase (CAT). Results: After 28 days, the body weight (BW) of the rats increased significantly in the SPso and SPvo groups compared to the CASvo group, by +11% and +7%, respectively. Cholesterolemia (TC) increased significantly in the SPso and SPvo groups compared to the CASvo group (P<0.01), while triglyceridemia (TG) decreased significantly in the SPso group compared to the SPvo and CASvo groups (P<0.01). Albumin (a marker of inflammation) increased in the SPso group compared to the SPvo and CASvo groups, by +35% and +13%, respectively. Serum TBARS levels were 40% lower in the SPso group than in the SPvo group, and liver TBARS were 80% and 76% lower in SPso compared to the SPvo and CASvo groups, respectively. Carbonyl derivative levels in the serum and liver were significantly reduced in the SPso group compared to the SPvo and CASvo groups. Superoxide dismutase (SOD) activity decreased in the liver of the SPso group compared to the SPvo group (P<0.01), whereas CAT activity increased in the liver tissue of the SPso group compared to the SPvo group (P<0.01). Conclusion: Sardine oil combined with sardine protein has a hypotriglyceridemic effect, reduces body weight, attenuates inflammation, appears to protect against lipid peroxidation and protein oxidation, and increases antioxidant defense in hypercholesterolemic and obese rats. This could favor a protective effect against obesity and cardiovascular diseases.
Keywords: rat, obesity, hypercholesterolemia, sardine protein, sardine oil, vegetable oils mixture, lipid peroxidation, protein oxidation, antioxidant defense
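The group comparisons above are reported as relative differences. As a minimal illustration of how such percentages are computed (the readings below are placeholders chosen for the arithmetic, not the study's measurements):

```python
def pct_diff(value, reference):
    """Relative difference of `value` with respect to `reference`, in percent."""
    return 100.0 * (value - reference) / reference

# Hypothetical serum TBARS readings (arbitrary units), invented only to
# illustrate the calculation behind a "40% lower" statement.
tbars_spso, tbars_spvo = 0.6, 1.0
print(f"{pct_diff(tbars_spso, tbars_spvo):+.0f}%")  # prints -40%
```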
Procedia PDF Downloads 66
615 Dream Work: Examining the Effectiveness of Dream Interpretation in Gaining Psychological Insight into Young Adults in Korea
Authors: Ahn Christine Myunghee, Sim Wonjin, Cho Kristina, Ahn Mira, Hong Yeju, Kwok Jihae, Lim Sooyeon, Park Hansol
Abstract:
With a sharp increase in the prevalence of mental health issues in Korea, there is a need for specific and effective intervention strategies in counseling and psychotherapy for use with Korean clients. Given the cultural emphasis on restraining emotional expression and not disclosing personal and familial problems to outsiders, clients often find it difficult to discuss their emotional issues even with therapists. Exploring a client's internal psychological processes while bypassing this culture-specific mode of therapeutic communication often becomes a challenge in the therapeutic setting. Given this socio-cultural context, the purpose of the current study was to investigate the effectiveness of using dream work with individuals in Korea. The current study conducted one 60-90 minute dream session per participant and analyzed the dream content of 39 Korean young adults to evaluate the effectiveness of the Hill dream model in accessing intra-psychic material, determining essential emotional themes, and learning how the individuals interpreted the contents of their dreams. The transcribed data, which included a total of 39 sessions from 39 volunteer university students, were analyzed with the Consensual Qualitative Research (CQR) approach in terms of domains and core ideas. Self-report measures of Dream Salience, Gains from Dream Interpretation, and the Session Evaluation Scale were administered before and after each dream session. The results indicated that dream work appears to be an effective way to understand unconscious motivations, thoughts, and feelings related to a person's sense of self, as well as how these people relate to others. The current findings need to be replicated with clients referred for counseling and psychotherapy to determine whether dream work is an appropriate and useful intervention in counseling settings.
Limitations of the current study and suggestions for future follow-ups are included in the discussion.
Keywords: dream work, dream interpretation, Korean, young adults, CQR
Procedia PDF Downloads 446
614 Transcriptional Differences in B cell Subpopulations over the Course of Preclinical Autoimmunity Development
Authors: Aleksandra Bylinska, Samantha Slight-Webb, Kevin Thomas, Miles Smith, Susan Macwana, Nicolas Dominguez, Eliza Chakravarty, Joan T. Merrill, Judith A. James, Joel M. Guthridge
Abstract:
Background: Systemic Lupus Erythematosus (SLE) is an interferon-related autoimmune disease characterized by B cell dysfunction. One of its main hallmarks is a loss of tolerance to self-antigens, leading to increased levels of autoantibodies against nuclear components (ANAs). However, up to 20% of healthy ANA+ individuals will not develop clinical illness. SLE is more prevalent among women and minority populations (African American, Asian American, and Hispanic). Moreover, African Americans have a stronger interferon (IFN) signature and develop more severe symptoms. The exact mechanisms involved in ethnicity-dependent B cell dysregulation and the progression from ANA+ healthy status to clinical disease remain unclear. Methods: Peripheral blood mononuclear cells (PBMCs) from African American (AA) and European American (EA) ANA- (n=12), ANA+ (n=12), and SLE (n=12) individuals were assessed by multimodal scRNA-Seq/CITE-Seq methods to examine differential gene signatures in specific B cell subsets. Library preparation was done with a 10X Genomics Chromium according to established protocols and sequenced on an Illumina NextSeq. The data were further analyzed for distinct cluster identification and differential gene signatures with the Seurat package in R, and pathway analysis was performed using Ingenuity Pathway Analysis (IPA). Results: Comparing all subjects, 14 distinct B cell clusters were identified using a community detection algorithm and visualized with Uniform Manifold Approximation and Projection (UMAP). The proportion of each of these clusters varied by disease status and ethnicity. Transitional B cells trended higher in ANA+ healthy individuals, especially in AA. A ribonucleoprotein-high population (elevated HNRNPH1, heterogeneous nuclear ribonucleoprotein; RNP-Hi) of proliferating naïve B cells was more prevalent in SLE patients, specifically in EA. An interferon-induced-protein-high population (IFIT-Hi) of naïve B cells was increased in EA ANA- individuals.
The proportions of memory B cell and plasma cell clusters tended to be expanded in SLE patients. As anticipated, we observed a stronger signature of cytokine-related pathways, especially interferon, in SLE individuals. Pathway analysis among AA individuals revealed an NRF2-mediated oxidative stress response signature in the transitional B cell cluster that was not seen in EA individuals. TNFR1/2 and Sirtuin signaling pathway genes were higher in AA IFIT-Hi naïve B cells, whereas they were not detected in EA individuals. Interferon signaling was observed in B cells of both ethnicities. Oxidative phosphorylation was found in age-related B cells (ABCs) for both ethnicities, whereas death receptor signaling was found in these cells only in EA patients. Interferon-related transcription factors were elevated in ABCs and IFIT-Hi naïve B cells in SLE subjects of both ethnicities. Conclusions: ANA+ healthy individuals have altered gene expression pathways in B cells that might drive apoptosis and subsequent clinical autoimmune pathogenesis. Increases in certain regulatory pathways may delay progression to SLE. Further, AA individuals have more elevated activation pathways that may make them more susceptible to SLE.
Procedia PDF Downloads 175
613 Co-Design of Accessible Speech Recognition for Users with Dysarthric Speech
Authors: Elizabeth Howarth, Dawn Green, Sean Connolly, Geena Vabulas, Sara Smolley
Abstract:
Through the EU Horizon 2020 Nuvoic Project, the project team recruited 70 individuals in the UK and Ireland to test the Voiceitt speech recognition app and provide user feedback to developers. The app is designed for people with dysarthric speech, to support communication with unfamiliar people and access to speech-driven technologies such as smart home equipment and smart assistants. Participants with atypical speech, due to a range of conditions such as cerebral palsy, acquired brain injury, Down syndrome, stroke and hearing impairment, were recruited, primarily through organisations supporting disabled people. Most had physical or learning disabilities in addition to dysarthric speech. The project team worked with individuals, their families and local support teams, to provide access to the app, including through additional assistive technologies where needed. Testing was user-led, with participants asked to identify and test the use cases most relevant to their daily lives over a period of three months or more. Ongoing technical support and training were provided remotely and in person throughout the testing period. Structured interviews were used to collect feedback on users' experiences, with delivery adapted to individuals' needs and preferences. Informal feedback was collected through ongoing contact between participants, their families and support teams and the project team. Focus groups were held to collect feedback on specific design proposals. User feedback shared with developers has led to improvements to the user interface and functionality, including faster voice training, simplified navigation, and the introduction of gamification elements and of switch access as an alternative to touchscreen access, with other feature requests from users still in development. This work offers a case study in successful and inclusive co-design with the disabled community.
Keywords: co-design, assistive technology, dysarthria, inclusive speech recognition
Procedia PDF Downloads 110
612 Teaching Non-Euclidean Geometries to Learn Euclidean One: An Experimental Study
Authors: Silvia Benvenuti, Alessandra Cardinali
Abstract:
In recent years, for instance in relation to the Covid-19 pandemic and the evidence of climate change, it has become quite clear that the development of a young person into an adult citizen requires a solid scientific background. Citizens are required to exert logical thinking and to know the methods of science in order to adapt, understand, and develop as persons. Mathematics sits at the core of these required skills: learning the axiomatic method is fundamental to understanding how the hard sciences work and helps consolidate logical thinking, which will be useful throughout a student's life. At the same time, research shows that the axiomatic study of geometry is a problematic topic for students, even for those with an interest in mathematics. With this in mind, the main goals of the research work we will describe are: (1) to show whether non-Euclidean geometries can be a tool that allows students to consolidate their knowledge of Euclidean geometry by developing it in a critical way; (2) to promote understanding of the modern axiomatic method in geometry; (3) to give students a new perspective on mathematics, so that they can see it as a creative activity and a widely discussed topic with a historical background. One of the main issues related to the state of the art in this topic is the shortage of experimental studies with students. For this reason, our aim is to provide further experimental evidence of the potential benefits of teaching non-Euclidean geometries in high school, based on data collected from a study started in 2005 in the frame of the Italian National Piano Lauree Scientifiche, continued in a teacher training course organized in September 2018, refined in a pilot study that involved 77 high school students during the school years 2018-2019 and 2019-2020, and finally implemented in an experimental study conducted in 2020-21 with 87 high school students.
Our study shows that there is potential for further research to challenge current conceptions of the school mathematics curriculum and of the capabilities of high school mathematics students.
Keywords: non-Euclidean geometries, beliefs about mathematics, questionnaires, modern axiomatic method
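As a concrete example of the kind of contrast such a course can exploit (a standard fact of constant-curvature geometry, not a result of the study): the triangle angle sum cleanly separates the three classical geometries.

```latex
% Angle sum of a triangle on a surface of constant curvature K
% (Gauss--Bonnet for geodesic triangles):
\[
\alpha + \beta + \gamma \;=\; \pi + K \cdot \mathrm{Area},
\qquad
\alpha + \beta + \gamma
\begin{cases}
= \pi & \text{Euclidean } (K = 0),\\
> \pi & \text{spherical } (K > 0),\\
< \pi & \text{hyperbolic } (K < 0).
\end{cases}
\]
```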
Procedia PDF Downloads 75
611 Effect of Blood Sugar Levels on Short Term and Working Memory Status in Type 2 Diabetics
Authors: Mythri G., Manjunath ML, Girish Babu M., Shireen Swaliha Quadri
Abstract:
Background: The increase in diabetes among the elderly is of concern because, in addition to the wide range of traditional diabetes complications, evidence has been growing that diabetes is associated with an increased risk of cognitive decline. Aims and Objectives: To find out whether there is any association between blood sugar levels and short-term and working memory status in patients with type 2 diabetes. Materials and Methods: The study was carried out in 200 individuals aged 40-65 years, consisting of 100 diagnosed cases of Type 2 Diabetes Mellitus and 100 non-diabetics from the OPD of McGann Hospital, Shivamogga. Rey's Auditory Verbal Learning Test (AVLT), a Verbal Fluency Test (VFT), a Visual Reproduction Test (VRT), a Working Digit Span Test (WDST) and a Validation Span Test (VST) were used to assess short-term and working memory. Fasting and post-prandial blood sugar levels were estimated. Statistical analysis was done using SPSS 21. Results: Memory test scores of type 2 diabetics were significantly reduced (p < 0.001) compared to the memory scores of age- and gender-matched non-diabetics. Fasting blood sugar levels were found to have a negative correlation with memory scores on all 5 tests: AVLT (r=-0.837), VFT (r=-0.888), VRT (r=-0.787), WDST (r=-0.795) and VST (r=-0.943). Post-prandial blood sugar levels were also negatively correlated with memory scores on all 5 tests: AVLT (r=-0.922), VFT (r=-0.848), VRT (r=-0.707), WDST (r=-0.729) and VST (r=-0.880). Memory scores on all 5 tests were negatively correlated with the FBS and PPBS levels in diabetic patients (p < 0.001). Conclusion: The decreased memory status in diabetic patients may be due to many factors, such as hyperglycemia, vascular disease, insulin resistance and amyloid deposition; other factors may combine to produce additive effects, such as the type of diabetes, co-morbidities, age of onset, duration of the disease and type of therapy.
These observed effects of blood sugar levels on the memory status of diabetics are of potential clinical importance, because even mild cognitive impairment can interfere with everyday activities.
Keywords: diabetes, cognition, HRV, respiratory medicine
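The negative associations reported above are Pearson correlation coefficients. A minimal sketch of the computation, using made-up paired observations rather than the study's data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired observations: fasting blood sugar (mg/dL) against a
# memory test score -- higher sugar, lower score, so r comes out strongly
# negative, mirroring the pattern reported in the abstract.
fbs    = [90, 110, 130, 150, 170, 190]
memory = [28, 26, 22, 20, 15, 12]
print(round(pearson_r(fbs, memory), 3))  # ≈ -0.993
```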
Procedia PDF Downloads 282
610 Design and Development of an Autonomous Beach Cleaning Vehicle
Authors: Mahdi Allaoua Seklab, Süleyman BaşTürk
Abstract:
In the quest to enhance coastal environmental health, this study introduces a fully autonomous beach cleaning machine, a breakthrough in leveraging green energy and advanced artificial intelligence for ecological preservation. Designed to operate independently, the machine is propelled by a solar-powered system, underscoring a commitment to sustainability and the use of renewable energy in autonomous robotics. The vehicle's autonomous navigation is achieved through a sophisticated integration of LIDAR and a camera system, utilizing an SSD MobileNet V2 object detection model for accurate, real-time trash identification. The SSD framework, renowned for its efficiency in detecting objects in various scenarios, is coupled with the lightweight and highly precise MobileNet V2 architecture, making it particularly suited to the computational constraints of on-board processing in mobile robotics. Training of the SSD MobileNet V2 model was conducted on Google Colab, harnessing cloud-based GPU resources to facilitate a rapid and cost-effective learning process. The model was refined with an extensive dataset of annotated beach debris, optimizing the parameters using the Adam optimizer and a cross-entropy loss function to achieve high-precision trash detection. This capability allows the machine to intelligently categorize and target waste, leading to more effective cleaning operations. This paper details the design and functionality of the beach cleaning machine, emphasizing its autonomous operational capabilities and the novel application of AI in environmental robotics. The results showcase the potential of such technology to fill existing gaps in beach maintenance, offering a scalable and eco-friendly solution to the growing problem of coastal pollution.
The deployment of this machine represents a significant advancement in the field, setting a new standard for the integration of autonomous systems in the service of environmental stewardship.
Keywords: autonomous beach cleaning machine, renewable energy systems, coastal management, environmental robotics
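The detector above was trained with a cross-entropy classification loss. As a minimal illustration of that loss, independent of the SSD/MobileNet specifics (the class names and probabilities below are invented for the example):

```python
import math

def cross_entropy(probs, true_idx):
    """Cross-entropy loss for a single example: the negative log of the
    probability the model assigned to the true class."""
    return -math.log(probs[true_idx])

# Hypothetical softmax outputs over three classes (say bottle, can, background),
# with "bottle" (index 0) as the true class in both cases.
confident = [0.90, 0.05, 0.05]  # most mass on the true class -> small loss
uncertain = [0.34, 0.33, 0.33]  # nearly uniform -> larger loss
print(cross_entropy(confident, 0) < cross_entropy(uncertain, 0))  # True
```

Minimizing this quantity over the training set (here, with the Adam optimizer) pushes the model to assign high probability to the annotated class of each debris example.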
Procedia PDF Downloads 27
609 Effectiveness of Project Grit in Building Resilience among At-Risk Adolescents: A Case Study
Authors: Narash Narasimman, Calvin Leong Jia Jun, Raksha Karthik, Paul Englert
Abstract:
Background: Project Grit, a 12-week youth resilience program implemented by Impart and Spartans Boxing Club, aimed to help at-risk adolescents develop resilience through psychoeducation and mental health techniques for dealing with everyday stressors and adversity. The programme consists of two parts: 1.5 hours of group therapy followed by 1 hour of boxing. Due to the novelty of the study, 6 male participants, aged 13 to 18, were recruited to participate. Aim: This case study aims to examine the effectiveness of Project Grit in building resilience among at-risk adolescents. Methods: A case study design was employed to capture the complexity and uniqueness of the intervention without oversimplifying or generalizing it. A 15-year-old male participant with a history of behavioural challenges, delinquency, and gang involvement was selected for the study. The teacher, parent, and child versions of the Strengths and Difficulties Questionnaire (SDQ) were administered to the facilitators, parents, and participant, respectively, before and after the programme. Relevant themes from the qualitative interviews are discussed. Results: Scores from all raters revealed improvements in most domains of the SDQ. Total difficulties scores across all raters improved from “very high” to “close to average”. High interrater reliability was observed (κ = .81). The participant reported learning methods to deal effectively with his everyday concerns using healthy coping strategies, developing a supportive social network, and building on his self-efficacy. Themes from the subject's report concurred with the improvement in SDQ scores. Conclusions: The findings suggest that Project Grit is a promising intervention for promoting resilience among at-risk adolescents. The teleological behaviourism framework and the combination of sports engagement and future orientation may be particularly effective in fostering resilience in this population.
Further studies need to be conducted with a larger sample size to further validate the effectiveness of Project Grit.
Keywords: resilience, project grit, adolescents, at-risk, boxing, future orientation
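The interrater agreement above is reported as Cohen's kappa. A minimal sketch of how κ is computed for two raters' categorical ratings (the ratings below are illustrative placeholders, not the study's data):

```python
def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' paired categorical ratings."""
    n = len(r1)
    labels = set(r1) | set(r2)
    # Observed agreement: proportion of items the raters label identically.
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: sum over labels of the product of marginal frequencies.
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical "high"/"low" difficulty ratings from two raters on six items.
rater_a = ["high", "high", "low", "low", "high", "low"]
rater_b = ["high", "high", "low", "low", "low", "low"]
print(round(cohen_kappa(rater_a, rater_b), 2))  # 0.67
```

Kappa corrects raw agreement for the agreement expected by chance, which is why a value like .81 is read as strong agreement rather than just a high percentage.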
Procedia PDF Downloads 61
608 Discerning Divergent Nodes in Social Networks
Authors: Mehran Asadi, Afrand Agah
Abstract:
In data mining, partitioning is used as a fundamental tool for classification. With the help of partitioning, we study the structure of data, which allows us to envision decision rules that can be applied to classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of being a divergent node. We used the R statistical computing language to conduct the analyses in this report. The data were found in the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In this work, we address overfitting and describe different approaches for the evaluation and performance comparison of different classification methods. In classification, the main objective is to categorize different items and assign them to groups based on their properties and similarities. Recursive partitioning is used to probe the structure of a data set, allowing us to envision decision rules and apply them to classify data into several groups. Estimating densities is hard, especially in high dimensions with limited data; we do not know the densities, but we can estimate them using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another. By calculating the correlation coefficients for the predictor variables, we see that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods, and the method applied to this dataset is decision trees (with k-fold cross-validation to prune the tree).
A decision tree is a non-parametric classification method that uses a set of rules to assign each observation to the most commonly occurring class label of the training data in its partition. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
Keywords: online social networks, data mining, social cloud computing, interaction and collaboration
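The recursive-partitioning idea behind decision trees can be sketched with a single split chosen by Gini impurity (the feature values and labels below are toy data, not the paper's dataset):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    """Threshold on one numeric feature that minimizes the weighted Gini
    impurity of the two resulting partitions; returns (threshold, impurity)."""
    best = None
    for t in sorted(set(xs)):
        left  = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if best is None or score < best[1]:
            best = (t, score)
    return best

# Toy feature (imagine node transitivity) and divergence labels: values up to
# 0.3 are all "normal", values above are all "divergent".
x = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
y = ["normal"] * 3 + ["divergent"] * 3
print(best_split(x, y))  # a perfect split at 0.3 with impurity 0.0
```

A full tree repeats this split search recursively on each partition; pruning with k-fold cross-validation, as described above, then removes splits that do not generalize.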
Procedia PDF Downloads 157
607 Multifunctional Epoxy/Carbon Laminates Containing Carbon Nanotubes-Confined Paraffin for Thermal Energy Storage
Authors: Giulia Fredi, Andrea Dorigato, Luca Fambri, Alessandro Pegoretti
Abstract:
Thermal energy storage (TES) is the storage of heat for later use, thus filling the gap between energy request and supply. The most widely used materials for TES are the organic solid-liquid phase change materials (PCMs), such as paraffin. These materials store/release a high amount of latent heat thanks to their high specific melting enthalpy, operate in a narrow temperature range and have a tunable working temperature. However, they suffer from a low thermal conductivity and need to be confined to prevent leakage. These two issues can be tackled by confining PCMs with carbon nanotubes (CNTs). TES applications include the buildings industry, solar thermal energy collection and thermal management of electronics. In most cases, TES systems are an additional component to be added to the main structure, but if weight and volume savings are key issues, it would be advantageous to embed the TES functionality directly in the structure. Such multifunctional materials could be employed in the automotive industry, where the diffusion of lightweight structures could complicate the thermal management of the cockpit environment or of other temperature sensitive components. This work aims to produce epoxy/carbon structural laminates containing CNT-stabilized paraffin. CNTs were added to molten paraffin in a fraction of 10 wt%, as this was the minimum amount at which no leakage was detected above the melting temperature (45°C). The paraffin/CNT blend was cryogenically milled to obtain particles with an average size of 50 µm. They were added in various percentages (20, 30 and 40 wt%) to an epoxy/hardener formulation, which was used as a matrix to produce laminates through a wet layup technique, by stacking five plies of a plain carbon fiber fabric. The samples were characterized microstructurally, thermally and mechanically. 
Differential scanning calorimetry (DSC) tests showed that the paraffin kept its ability to melt and crystallize in the laminates, and the melting enthalpy was almost proportional to the paraffin weight fraction. These thermal properties were retained after fifty heating/cooling cycles. Laser flash analysis showed that the through-thickness thermal conductivity increased with increasing PCM content, due to the presence of CNTs. The ability of the developed laminates to contribute to thermal management was also assessed by monitoring their cooling rates with a thermal camera. Three-point bending tests showed that the flexural modulus was only slightly impaired by the presence of the paraffin/CNT particles, while a more substantial decrease in the stress and strain at break and in the interlaminar shear strength was detected. Optical and scanning electron microscope images revealed that this could be attributed to the preferential location of the PCM in the interlaminar region. These results demonstrate the feasibility of multifunctional structural TES composites and highlight that the PCM size and distribution affect the mechanical properties. In this perspective, this group is working on the encapsulation of paraffin in a sol-gel derived organosilica shell. Submicron spheres have been produced, and current activity focuses on optimizing the synthesis parameters to increase the emulsion efficiency.
Keywords: carbon fibers, carbon nanotubes, lightweight materials, multifunctional composites, thermal energy storage
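The near-proportionality between melting enthalpy and paraffin fraction can be sketched with a simple rule-of-mixtures estimate. The specific enthalpy below is a typical paraffin figure assumed for illustration, not the paper's measured value:

```python
# Rule-of-mixtures estimate: only the paraffin fraction of the matrix
# contributes latent heat. The PCM particles are a paraffin/CNT blend with
# 10 wt% CNTs, as described above.
DH_PARAFFIN = 200.0   # J/g, assumed typical paraffin value (illustrative)
CNT_IN_BLEND = 0.10   # 10 wt% CNTs in the paraffin/CNT blend

def composite_enthalpy(blend_fraction):
    """Estimated melting enthalpy (J/g) of a matrix containing the given
    weight fraction of paraffin/CNT blend."""
    paraffin_fraction = blend_fraction * (1.0 - CNT_IN_BLEND)
    return paraffin_fraction * DH_PARAFFIN

for wf in (0.20, 0.30, 0.40):   # the blend loadings studied in the paper
    print(f"{wf:.0%} blend -> ~{composite_enthalpy(wf):.0f} J/g")
```

This linear scaling is exactly the trend the DSC measurements are checking against: deviations from it would indicate that confinement or processing altered the paraffin's phase-change behavior.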
Procedia PDF Downloads 160
606 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data
Authors: Michelangelo Sofo, Giuseppe Labianca
Abstract:
In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process, based on the analysis of clinical data, is a very delicate operation, because there are multiple solutions for the management of pathologies in the food sector (for example, intolerances and allergies, management of cholesterol metabolism, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this regard, this research work created a system capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it is therefore in step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists of drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data, plus a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces.
Furthermore, the system has multiple software modules based on time series and visual analytics techniques that make it possible to evaluate the complete picture of the situation and the evolution of the diet assigned for specific pathologies.
Keywords: medical decision support, physiological data extraction, data-driven diagnosis, human centered AI, symbiotic AI paradigm
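The deductive-database idea, deriving admissible choices from patient facts and exclusion rules, can be caricatured in a few lines. All pathologies, rules, and foods below are invented placeholders, not DietAdhoc®'s actual schema or knowledge base:

```python
# Toy illustration of the deductive step: patient facts (pathologies) plus
# exclusion rules derive the set of admissible foods. Names are placeholders.
FOODS = {"oat porridge", "blue cheese", "salted nuts", "grilled fish"}
EXCLUDES = {  # pathology -> foods ruled out for it
    "lactose intolerance": {"blue cheese"},
    "arterial hypertension": {"salted nuts", "blue cheese"},
}

def admissible(pathologies):
    """Foods not excluded by any of the patient's pathologies."""
    ruled_out = set().union(*(EXCLUDES.get(p, set()) for p in pathologies))
    return FOODS - ruled_out

print(sorted(admissible({"arterial hypertension"})))
# ['grilled fish', 'oat porridge']
```

A real deductive database would express such rules declaratively (e.g. in Datalog) and chain them; the point here is only the direction of inference, from stored facts and rules to explainable conclusions.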
Procedia PDF Downloads 23
605 The Effect of the Base Computer Method on Repetitive Behaviors and Communication Skills
Authors: Hoorieh Darvishi, Rezaei
Abstract:
Introduction: This study investigates the efficacy of computer-based interventions for children with Autism Spectrum Disorder, specifically targeting communication deficits and repetitive behaviors. The research evaluates novel software applications designed to enhance narrative capabilities and sensory integration through structured, progressive intervention protocols. Method: The study evaluated two intervention software programs designed for children with autism, focusing on narrative speech and sensory integration. Twelve children aged 5-11 participated in the two-month intervention, attending three 45-minute weekly sessions, with pre- and post-tests measuring speech, communication, and behavioral outcomes. The narrative speech software incorporated 14 stories using the Cohen model. It progressively reduced software assistance as children improved their storytelling abilities, ultimately enabling independent narration. The process involved story comprehension questions and guided story completion exercises. The sensory integration software featured approximately 100 exercises progressing from basic classification to complex cognitive tasks. The program included attention exercises, auditory memory training (advancing from single- to four-syllable words), problem-solving, decision-making, reasoning, working memory, and emotion recognition activities. Each module was accompanied by frequency- and pitch-adjusted music that the children enjoy, to enhance learning through multiple sensory channels (visual, auditory, and tactile). Conclusion: The results indicated that the use of these software programs significantly improved communication and narrative speech scores in children, while also reducing scores related to repetitive behaviors.
Findings: These findings highlight the positive impact of computer-based interventions on enhancing communication skills and reducing repetitive behaviors in children with autism.
Keywords: autism, communication skills, repetitive behaviors, sensory integration
Procedia PDF Downloads 9
604 Prospective Mathematics Teachers' Content Knowledge on the Definition of Limit and Derivative
Authors: Reyhan Tekin Sitrava
Abstract:
Teachers should have robust and comprehensive content knowledge for effective mathematics teaching. It has been explained that content knowledge includes knowing facts, truths, and concepts; explaining the reasons behind these facts, truths, and concepts; and making relationships between the concepts and other disciplines. By virtue of its importance, it is significant to explore teachers' and prospective teachers' content knowledge related to a variety of topics in mathematics. From this point of view, the purpose of this study was to investigate prospective mathematics teachers' content knowledge. In particular, it aimed to reveal the prospective teachers' knowledge regarding the definitions of limit and derivative. To achieve this purpose and to gain an in-depth understanding, a qualitative case study method was used. The data were collected from 34 prospective mathematics teachers through a questionnaire containing 2 questions: the first required the prospective teachers to define the limit, and the second required them to define the derivative. The data were analyzed using the content analysis method. Based on the analysis, although half of the prospective teachers (50%) could write the definition of the limit, nine prospective teachers (26.5%) could not define it, and eight prospective teachers' definitions were regarded as partially correct. On the other hand, twenty-seven prospective teachers (79.5%) could define the derivative, but seven of them (20.5%) defined it only partially. According to the findings, most of the prospective teachers have robust content knowledge of limit and derivative. This result is important because definitions have a vital role in the learning and teaching of mathematics. More specifically, a definition is the starting point for understanding the meaning of a concept. From this point of view, prospective teachers should know the definitions of concepts to be able to teach them correctly to students.
In addition, they should have knowledge of the relationship between the limit and the derivative so that they can explain these concepts conceptually; otherwise, students may simply memorize the rules for calculating them. In conclusion, the present study showed that most of the prospective mathematics teachers had adequate knowledge of the definitions of the derivative and the limit, while the rest should learn these definitions conceptually. Examples of correct, partially correct, and incorrect definitions of both concepts will be presented and discussed based on the participants' statements. This study has implications for instructors, who should attend to whether students have truly learned the definitions of these concepts. To this end, instructors may give prospective teachers opportunities to discuss the definitions of these concepts and the relationship between them.
Keywords: content knowledge, derivative, limit, prospective mathematics teachers
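For reference, the standard textbook forms of the two definitions the questionnaire targets (stated here in conventional notation, not necessarily the exact wording expected of participants) are:

```latex
% Epsilon-delta definition of the limit:
\lim_{x \to a} f(x) = L \iff
\forall \varepsilon > 0 \;\exists \delta > 0 :\;
0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon

% The derivative as the limit of the difference quotient,
% which also exhibits the relationship between the two concepts:
f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}
```

The second definition makes the conceptual dependence explicit: the derivative is itself defined as a limit, which is why the study treats knowledge of the two definitions together.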
Procedia PDF Downloads 221
603 The Effect of the Base Computer Method on Repetitive Behaviors and Communication Skills
Authors: Hoorieh Darvishi, Rezaei
Abstract:
Introduction: This study investigates the efficacy of computer-based interventions for children with Autism Spectrum Disorder, specifically targeting communication deficits and repetitive behaviors. The research evaluates novel software applications designed to enhance narrative capabilities and sensory integration through structured, progressive intervention protocols. Method: The study evaluated two intervention software programs designed for children with autism, focusing on narrative speech and sensory integration. Twelve children aged 5-11 participated in the two-month intervention, attending three 45-minute sessions weekly, with pre- and post-tests measuring speech, communication, and behavioral outcomes. The narrative speech software incorporated 14 stories using the Cohen model. It progressively reduced software assistance as children improved their storytelling abilities, ultimately enabling independent narration; the process involved story comprehension questions and guided story completion exercises. The sensory integration software featured approximately 100 exercises progressing from basic classification to complex cognitive tasks. The program included attention exercises, auditory memory training (advancing from single- to four-syllable words), problem-solving, decision-making, reasoning, working memory, and emotion recognition activities. Each module was accompanied by frequency- and pitch-adjusted music that the child enjoys, to enhance learning through multiple sensory channels (visual, auditory, and tactile). Conclusion: The results indicated that the use of these software programs significantly improved communication and narrative speech scores in children, while also reducing scores related to repetitive behaviors.
Findings: These findings highlight the positive impact of computer-based interventions on enhancing communication skills and reducing repetitive behaviors in children with autism.
Keywords: autism, narrative speech, Persian, SI, repetitive behaviors, communication
Procedia PDF Downloads 9
602 Rheological and Microstructural Characterization of Concentrated Emulsions Prepared by Fish Gelatin
Authors: Helen S. Joyner (Melito), Mohammad Anvari
Abstract:
Concentrated emulsions stabilized by proteins are systems of great importance in food, pharmaceutical, and cosmetic products. Controlling emulsion rheology is critical for ensuring desired properties during the formation, storage, and consumption of emulsion-based products. Studies on concentrated emulsions have focused on the rheology of monodispersed systems; however, emulsions used for industrial applications are polydispersed in nature, and this polydispersity is regarded as an important parameter that also governs the rheology of concentrated emulsions. Therefore, the objective of this study was to characterize the rheological (small- and large-deformation) and microstructural properties of concentrated emulsions that were not truly monodispersed, as is usually the case in food products such as margarines, mayonnaise, creams, and spreads. The concentrated emulsions were prepared at different concentrations of fish gelatin (0.2, 0.4, and 0.8% w/v in the whole emulsion system), an oil-water ratio of 80:20 (w/w), a homogenization speed of 10000 rpm, and 25°C. Confocal laser scanning microscopy (CLSM) was used to determine the microstructure of the emulsions; to prepare samples for CLSM analysis, fish gelatin solutions were stained with fluorescein isothiocyanate dye. Emulsion viscosity profiles were determined using shear rate sweeps (0.01 to 100 1/s). The linear viscoelastic regions (LVRs) of the emulsions were determined using strain sweeps (0.01 to 100% strain) for each sample. Frequency sweeps were performed in the LVR (0.1% strain) from 0.6 to 100 rad/s. Large amplitude oscillatory shear (LAOS) testing was conducted by collecting raw waveform data at 0.05, 1, 10, and 100% strain at four different frequencies (0.5, 1, 10, and 100 rad/s). All measurements were performed in triplicate at 25°C. The CLSM results revealed that increased fish gelatin concentration resulted in more stable oil-in-water emulsions with homogeneous, finely dispersed oil droplets.
Furthermore, the protein concentration had a significant effect on emulsion rheological properties. Apparent viscosity and dynamic moduli at small deformations increased with increasing fish gelatin concentration; these results were related to increased inter-droplet network connections caused by increased fish gelatin adsorption at the surface of the oil droplets. Nevertheless, all samples showed shear-thinning and weak-gel behaviors over the shear rate and frequency sweeps, respectively. Lissajous plots (plots of stress versus strain) and phase lag values were used to determine the nonlinear behavior of the emulsions in LAOS testing. Greater distortion in the elliptical shape of the plots, accompanied by higher phase lag values, was observed at large strains and frequencies in all samples, indicating increased nonlinear behavior. Shifts from elastic-dominated to viscous-dominated behavior were also observed; these shifts were attributed to damage to the sample microstructure (e.g., gel network disruption), which would lead to viscous-type behaviors such as permanent deformation and flow. Unlike the small-deformation results, the LAOS behavior of the concentrated emulsions was not dependent on fish gelatin concentration: systems with different microstructures showed similar nonlinear viscoelastic behaviors. The results of this study provide valuable information that can be used to incorporate concentrated emulsions in emulsion-based food formulations.
Keywords: concentrated emulsion, fish gelatin, microstructure, rheology
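For readers unfamiliar with phase-lag analysis, the measured quantity can be sketched numerically. The snippet below is a minimal illustration on synthetic waveforms with made-up amplitudes, not data from this study: it recovers the phase angle δ between an imposed sinusoidal strain and the stress response by first-harmonic projection. A δ below 45° indicates elastic-dominated behavior; above 45°, viscous-dominated.

```python
import numpy as np

# Synthetic oscillatory-shear waveforms (illustrative amplitudes, not data
# from this study): a sinusoidal strain and a stress response that lags it.
omega = 1.0                                   # angular frequency, rad/s
period = 2 * np.pi / omega
t = np.linspace(0, 2 * period, 4000, endpoint=False)
gamma0 = 0.05                                 # strain amplitude
delta_true = np.deg2rad(30)                   # 30 deg -> elastic-dominated
strain = gamma0 * np.sin(omega * t)
stress = 120.0 * np.sin(omega * t + delta_true)   # Pa

# First-harmonic analysis: project the stress onto the in-phase and
# out-of-phase components of the strain to recover the phase lag delta.
in_phase = np.mean(stress * np.sin(omega * t))    # ~ (amplitude/2) cos(delta)
out_phase = np.mean(stress * np.cos(omega * t))   # ~ (amplitude/2) sin(delta)
delta_est = np.arctan2(out_phase, in_phase)

# tan(delta) = G''/G'; delta < 45 deg means elastic-dominated behaviour.
print(f"phase lag ~ {np.degrees(delta_est):.1f} deg")
```

In LAOS, where the response contains higher harmonics, this first-harmonic δ is complemented by inspecting the distortion of the stress-strain Lissajous loop, as done in the study.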
Procedia PDF Downloads 275
601 Using Industry Projects to Modernize Business Education
Authors: Marie Sams, Kate Barnett-Richards, Jacqui Speculand, Gemma Tombs
Abstract:
Business education in the United Kingdom has improved over the years, moving from traditional chalk-and-talk lectures to the use of digital technologies and guest lecturers invited from industry. Engaging, topical industry talks are generally seen as a positive way to enhance course delivery; however, it is acknowledged that there may be better ways for industry to contribute to the quality of business programmes. Additionally, there is a consensus amongst UK industry managers that greater involvement in designing and contributing to the business curriculum would have a greater impact on the quality of business-ready graduates. Funded by the Disruptive Media Learning Lab at Coventry University in the UK, a project (SOPI - Student Online Projects with Industry) was initiated to enable students to work in project teams to respond and engage with real problems and challenges faced by five managers in various industries, including retail, events, and manufacturing. Over a semester, approximately 200 students were given the opportunity to develop their management, facilitation, problem-solving, and reflective skills while gaining exposure to real industry challenges, with a focus on supply chain and project management. Face-to-face seminars were redesigned to enable students to work on live issues in a competitive environment, and students were guided to consider the theoretical aspects of their module delivery to underpin the solutions they generated. Dialogue between student groups and managers took place using a Google+ community, an online social media tool that enables private discussions and can be accessed on mobile devices. The results of the project will be shared, showing how this development has added value to students' experience and understanding of the two subject areas.
Student reflections will be analysed and evaluated to assess how the project has contributed to their perception of how the theoretical nature of these two business subjects is applied in practical situations.
Keywords: business, education, industry, projects
Procedia PDF Downloads 183
600 Algorithm for Improved Tree Counting and Detection through Adaptive Machine Learning Approach with the Integration of Watershed Transformation and Local Maxima Analysis
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
The Philippines has long been considered a valuable producer of high-value crops globally. The country's employment and economy depend on agriculture, increasing the demand for efficient agricultural mechanisms. Remote sensing and geographic information technology have proven to provide effective applications for precision agriculture through image-processing techniques, particularly given the development of aerial scanning technology in the country. Accurate information concerning spatial correlation within the field is especially important for precision farming of high-value crops. The availability of height information and high-spatial-resolution images obtained from aerial scanning, together with the development of new image analysis methods, is contributing substantially to precision agriculture techniques and applications. In this study, an algorithm was developed and implemented to detect and count high-value crops simultaneously through adaptive scaling of a support vector machine (SVM) algorithm, subjected to an object-oriented approach combining watershed transformation and a local maxima filter to enhance tree counting and detection. The methodology is compared to cutting-edge template matching procedures to demonstrate its effectiveness on a demanding tree counting, recognition, and delineation problem. Since common data and image processing techniques are utilized, the algorithm can easily be implemented in production processes covering large agricultural areas. The algorithm was tested on high-value crops such as palm, mango, and coconut located in Misamis Oriental, Philippines, showing good performance, particularly for young adult and adult trees, with accuracy significantly above 90%. The outputs can support inventories or database updating, allowing for the reduction of field work and manual interpretation tasks.
Keywords: high value crop, LiDAR, OBIA, precision agriculture
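The local-maxima and watershed stages of such a pipeline can be sketched in a few lines. The following is an illustrative toy example on a synthetic canopy height model, not the authors' implementation (which additionally involves adaptive SVM scaling and object-oriented analysis): treetop candidates are pixels that survive a local-maxima filter, and a watershed on the inverted height surface delineates one crown per treetop.

```python
import numpy as np
from scipy import ndimage

# Synthetic canopy height model (CHM): two Gaussian "crowns" on flat ground.
yy, xx = np.mgrid[0:60, 0:60]
chm = (10.0 * np.exp(-((yy - 20) ** 2 + (xx - 20) ** 2) / 40.0)
       + 8.0 * np.exp(-((yy - 40) ** 2 + (xx - 42) ** 2) / 30.0))

# Step 1 - local maxima filter: a pixel is a treetop candidate if it equals
# the maximum of its 9x9 neighbourhood and exceeds a minimum crown height.
tops = (chm == ndimage.maximum_filter(chm, size=9)) & (chm > 2.0)
markers, n_trees = ndimage.label(tops)   # one integer label per treetop

# Step 2 - watershed on the inverted CHM: flooding outward from each
# treetop marker delineates one crown segment per detected tree.
cost = ((chm.max() - chm) / chm.max() * 255).astype(np.uint8)
ws_markers = markers.astype(np.int16)
ws_markers[chm < 0.5] = n_trees + 1      # dedicated "ground" marker
crowns = ndimage.watershed_ift(cost, ws_markers)

print("trees detected:", n_trees)
```

On real LiDAR-derived CHMs the height threshold, window size, and ground mask would be tuned per crop type, which is where the adaptive scaling described above comes in.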
Procedia PDF Downloads 402
599 A Genre-Based Approach to the Teaching of Pronunciation
Authors: Marden Silva, Danielle Guerra
Abstract:
Some studies have indicated that pronunciation teaching has not received enough attention from teachers in EFL contexts. In particular, teaching segmental and suprasegmental features through a genre-based approach may offer an opportunity to integrate pronunciation into more meaningful learning practice. Therefore, the aim of this project was to carry out a survey of the aspects of English pronunciation that Brazilian students consider most difficult to learn, thus enabling the discussion of strategies that can facilitate the development of oral skills in English classes by integrating the teaching of phonetic-phonological aspects into the genre-based approach. The notions of intelligibility, fluency, and accuracy have been proposed by some authors as an ideal didactic sequence: according to their proposals, basic learners should be exposed to activities focused on intelligibility, intermediate students to fluency, and more advanced ones to accuracy practices. In order to test this hypothesis, data collection was conducted during three high school English classes at the Federal Center for Technological Education of Minas Gerais (CEFET-MG), in Brazil, through questionnaires and didactic activities, which were recorded and transcribed for further analysis. The debate genre was chosen to facilitate the participants' oral expression in a freer way, having them answer questions and give their opinions about a previously selected topic. The findings indicated that basic students demonstrated more difficulty with aspects of English pronunciation than the others; many of the intelligibility aspects analyzed had to be listened to more than once for a better understanding.
For intermediate students, the recorded speeches were considerably easier to understand; nevertheless, these students found it more difficult to pronounce the words fluently, often interrupting their speech to think about what they were going to say and how they would say it. Lastly, more advanced learners expressed their ideas more fluently, but subtle errors related to accuracy were still perceptible in their speech, thereby confirming the proposed hypothesis. It was also seen that using the genre-based approach to promote oral communication in English classes can be a relevant method, considering the socio-communicative function inherent in the suggested approach.
Keywords: EFL, genre-based approach, oral skills, pronunciation
Procedia PDF Downloads 130
598 The Noun-Phrase Elements on the Usage of the Zero Article
Authors: Wen Zhen
Abstract:
Compared to content words, function words, especially articles, have been relatively overlooked by English learners. The article system, driven by several factors, becomes to a certain extent a barrier to knowing English better; three principal factors related to the nature of the articles can be summarized when accounting for its difficulty. The second language acquisition process makes the article system more complex still, for [-ART] learners have to create a new grammatical category, causing even many non-native speakers at a proficient level to make errors. According to the acquisition sequence of the English articles, the zero article is acquired first and with high inaccuracy: the zero article is often overused in the early stages of L2 acquisition. Although learners at the intermediate level shift toward underusing the zero article once they realize that it does not cover every case, overproduction of the zero article occurs even among advanced L2 learners. The aim of this study is to investigate the noun-phrase factors that give rise to incorrect usage or overuse of the zero article, thus providing suggestions for L2 English acquisition; moreover, it can enable teachers to carry out effective instruction that activates students' conscious learning. The research question is answered through a corpus-based, data-driven approach that analyzes noun-phrase elements in terms of semantic context and the countability of noun phrases. Based on the analysis of the International Thurber Thesis corpus, the results show that: (1) although the [-definite, -specific] context favored the zero article, both [-definite, +specific] and [+definite, -specific] showed less influence; when we reflect on the frequency order of the zero article, prototypicality plays a vital role. (2) The EFL learners in this study have trouble classifying abstract nouns as countable.
We find that overuse of the zero article arises when learners cannot make clear judgements on countability altered from [+definite] to [-definite]: once a noun is perceived as uncountable by learners, the choice falls back on the zero article. These findings suggest that learners should be engaged in recognizing the countability of new vocabulary, with nouns explained in lexical phrases, and should explore more complex aspects such as discourse-dependent analysis.
Keywords: noun phrase, zero article, corpus, second language acquisition
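For readers unfamiliar with this kind of corpus query, the basic frequency count underlying such a data-driven analysis can be sketched as follows. The mini-corpus, noun list, and tags below are invented for illustration (the actual study drew on the International Thurber Thesis corpus): each noun phrase is reduced to its article choice, and the shares of zero article, indefinite article, and definite article are tallied.

```python
# Hypothetical (article, head-noun) pairs extracted from learner writing;
# None stands for the zero article.  A real study would draw these from a
# tagged learner corpus rather than a hand-made list like this one.
np_tokens = [
    (None, "information"), (None, "advice"), ("a", "book"),
    (None, "books"), ("the", "truth"), (None, "water"),
    ("an", "idea"), (None, "happiness"),
]

# Tally article choice per noun phrase.
counts = {"zero": 0, "a/an": 0, "the": 0}
for article, noun in np_tokens:
    if article is None:
        counts["zero"] += 1
    elif article in ("a", "an"):
        counts["a/an"] += 1
    else:
        counts["the"] += 1

# Relative frequencies, the figures compared across semantic contexts.
total = len(np_tokens)
shares = {k: round(v / total, 3) for k, v in counts.items()}
print(counts, shares)
```

In the study proper, such counts are cross-tabulated against the [±definite, ±specific] context and the countability of the head noun, rather than pooled as in this toy sample.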
Procedia PDF Downloads 253