Search results for: digital learning objects
3083 On-Road Text Detection Platform for Driver Assistance Systems
Authors: Guezouli Larbi, Belkacem Soundes
Abstract:
The automation of the text detection process can assist drivers in their driving task. Its application can be very useful in giving drivers more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first, pseudo-Zernike moments are used to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) which serves as a classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.
Keywords: text detection, CNN, PZM, deep learning
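As a rough illustration of the two-stage idea above, the sketch below proposes candidate regions with a simple variance heuristic and filters them with an edge-density stub; both are illustrative stand-ins for the paper's pseudo-Zernike moments and ORTDN classifier, not its implementation:

```python
import numpy as np

def propose_rois(image, win=16, stride=8, var_thresh=500.0):
    """Stage 1 (sketch): slide a window over a grayscale image and keep
    high-variance patches as candidate text regions. The paper uses
    pseudo-Zernike moments here; plain variance is a stand-in."""
    rois = []
    h, w = image.shape
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            if image[y:y + win, x:x + win].var() > var_thresh:
                rois.append((x, y, win, win))
    return rois

def classify_roi(patch):
    """Stage 2 stub: a trained CNN (the paper's ORTDN) would score each
    candidate; a fixed edge-density heuristic stands in for it."""
    gy, gx = np.gradient(patch.astype(float))
    edge_density = np.mean(np.hypot(gx, gy) > 10.0)
    return edge_density > 0.2  # True -> "text"

# Toy frame: flat background with one coarse striped band as fake "text".
img = np.zeros((64, 64), dtype=np.uint8)
img[24:40, (np.arange(64) // 2) % 2 == 0] = 255

rois = propose_rois(img)
text_rois = [r for r in rois
             if classify_roi(img[r[1]:r[1] + r[3], r[0]:r[0] + r[2]])]
print(len(rois), len(text_rois))
```

On the toy frame, only windows overlapping the striped band survive both stages, mirroring the proposal-then-classification flow the abstract describes.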
Procedia PDF Downloads 83
3082 A Comparative Analysis of Liberation and Contemplation in Sankara and Aquinas
Authors: Zeite Shumneiyang Koireng
Abstract:
Liberation is the act of liberating or the state of being liberated. Indian philosophy, in general, understands liberation as moksa, which is etymologically derived from the Sanskrit root muc+ktin, meaning to loose, set free, let go, discharge, release, liberate, deliver, etc. According to the Indian schools of thought, moksa is the highest value, on realizing which nothing remains to be realized. It is the cessation of birth and death and of all kinds of pain, and at the same time it is the realization of one's own self. Sankara's Advaita philosophy is based on the following propositions: Brahman is the only Reality; the world has apparent reality; and the soul is not different from Brahman. According to Sankara, Brahman is the basis on which the world form appears; it is the sustaining ground of all its various modifications. It is the highest self, and the self of all reveals himself by dividing himself, as it were, into various objects in multiple ways. The whole world is the manifestation of the Supreme Being. Brahman modifying itself into the Atman, or internal self of all things, is the world. Since Brahman is the upadhana karana of the world, the sruti speaks of the world as the modification of Brahman into the Atman of the effect. Contemplation as the fulfillment of man finds a radical foundation in Aquinas' teaching concerning the natural end or, as he also referred to it, natural desire. The third book of the Summa Contra Gentiles begins the study of happiness with a consideration of natural desire. According to him, all creatures, even those devoid of understanding, are ordered to God as an ultimate end. Intrinsic to every nature is a tendency or inclination, originating in the natural form, toward the end for which the possessor of that nature exists. It is through the study of the nature and finality of this inclination that Aquinas establishes, by an argument of induction, man's contemplation of God as the fulfillment of his nature.
The present paper attempts to critically approach two important and seminal bodies of thought, representing the Indian and Western traditions, which left their mark on the thinking of their respective times. Both of these, the Advaitic concept of Liberation in the Indian tradition and the concept of Contemplation in Thomas Aquinas' Summa Contra Gentiles, directly confront the question of the ultimate meaning of human existence. According to Sankara, it is knowledge and knowledge alone which is the means of moksa, and the highest knowledge is moksa itself. Liberation in Sankara Vedanta is attained as a process of purification of self, which gradually and increasingly turns into purer and purer intentional construction. Man's inner natural tendency, for Aquinas, is towards knowledge. The human subject is driven to know more and more about reality, and in particular about the highest reality. Contemplation of this highest reality is fulfillment in the philosophy of Aquinas; indeed, Contemplation is the perfect activity in man's present state of existence.
Keywords: liberation, Brahman, contemplation, fulfillment
Procedia PDF Downloads 193
3081 Effects of Evening vs. Morning Training on Motor Skill Consolidation in Morning-Oriented Elderly
Authors: Maria Korman, Carmit Gal, Ella Gabitov, Avi Karni
Abstract:
The main question addressed in this study was whether the time of day at which training is afforded is a significant factor for motor skill ('how-to', procedural knowledge) acquisition and consolidation into long-term memory in the healthy elderly population. Twenty-nine older adults (60-75 years) practiced an explicitly instructed 5-element key-press sequence by repeatedly generating the sequence 'as fast and accurately as possible'. The contribution of three parameters to acquisition, 24h post-training consolidation, and 1-week retention gains in motor sequence speed was assessed: (a) time of training (morning vs. evening group), (b) sleep quality (actigraphy), and (c) chronotype. All study participants were moderately morning-type according to the Morningness-Eveningness Questionnaire score. All participants had sleep patterns typical of their age, with an average sleep efficiency of ~82% and approximately 6 hours of sleep. The speed of motor sequence performance in both groups improved to a similar extent during the training session. Nevertheless, the evening group expressed small but significant overnight consolidation-phase gains, while the morning group showed only maintenance of the performance level attained at the end of training. By the 1-week retention test, both groups showed similar performance levels, with no significant gains or losses with respect to the 24h test. Changes in the tapping patterns at 24h and 1 week post-training were assessed based on normalized Pearson correlation coefficients using the Fisher z-transformation, in reference to the tapping pattern attained at the end of training. Significant differences between the groups were found: the evening group showed larger changes in tapping patterns across the consolidation and retention windows. Our results show that morning-oriented older adults effectively acquired, consolidated, and maintained a new sequence of finger movements following both morning and evening practice sessions.
However, time of training affected the time course of skill evolution in terms of performance speed, as well as the re-organization of tapping patterns during the consolidation period. These results are in line with the notion that motor training preceding a sleep interval may be beneficial for long-term memory in the elderly. Evening training should be considered an appropriate time window for motor skill learning in older adults, even in individuals with a morning chronotype.
Keywords: time-of-day, elderly, motor learning, memory consolidation, chronotype
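The Fisher z-transformation mentioned above maps a Pearson correlation r to z = arctanh(r), making correlation values approximately normally distributed and comparable across sessions. A minimal sketch of that analysis idea, using hypothetical inter-key-press intervals rather than the study's data:

```python
import numpy as np

def pattern_change(ref, test):
    """Correlate two tapping patterns (vectors of inter-key-press
    intervals) and pass r through Fisher's z = arctanh(r). A sketch of
    the analysis idea only, not the study's exact pipeline."""
    r = np.corrcoef(ref, test)[0, 1]
    return r, np.arctanh(r)

# Hypothetical interval vectors (seconds) for illustration.
end_of_training = np.array([0.42, 0.35, 0.50, 0.38])
at_24h = np.array([0.40, 0.36, 0.47, 0.39])
r, z = pattern_change(end_of_training, at_24h)
print(round(r, 3), round(z, 3))
```

Smaller z across the consolidation window would indicate a larger re-organization of the tapping pattern relative to end-of-training.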
Procedia PDF Downloads 135
3080 Improving Low English Oral Skills of 5 Second-Year English Major Students at Debark University
Authors: Belyihun Muchie
Abstract:
This study investigates the low English oral communication skills of 5 second-year English major students at Debark University. It aims to identify the key factors contributing to their weaknesses and propose effective interventions to improve their spoken English proficiency. Mixed-methods research will be employed, utilizing observations, questionnaires, and semi-structured interviews to gather data from the participants. To clearly identify these factors, structured and informal observations will be employed; the former will be used to assess fluency, pronunciation, vocabulary use, and grammatical accuracy, while the latter is suited to observing the natural interactions and communication patterns of learners in the classroom setting. The questionnaires will assess the students' self-perceptions of their skills, perceived barriers to fluency, and preferred learning styles. Interviews will also delve deeper into their experiences and explore specific obstacles faced in oral communication. Data analysis will involve both quantitative and qualitative responses: the structured observation and questionnaire data will be analyzed quantitatively, whereas the informal observation and interview transcripts will be analyzed thematically. The findings will be used to identify the major causes of low oral communication skills, such as limited vocabulary, grammatical errors, pronunciation difficulties, or lack of confidence. They will also inform the development of targeted solutions addressing these causes, such as intensive pronunciation practice, conversation simulations, personalized feedback, or anxiety-reduction techniques. Finally, the findings will guide the design of an intervention plan for implementation during the action research phase.
The study's outcomes are expected to provide valuable insights into the challenges faced by English major students in developing oral communication skills, contribute to the development of evidence-based interventions for improving spoken English proficiency in similar contexts, and offer practical recommendations for English language instructors and curriculum developers to enhance student learning outcomes. By addressing the specific needs of these students and implementing tailored interventions, this research aims to bridge the gap between theoretical knowledge and practical speaking ability, equipping them with the confidence and skills to flourish in English communication settings.
Keywords: oral communication skills, mixed-methods, evidence-based interventions, spoken English proficiency
Procedia PDF Downloads 51
3079 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation
Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong
Abstract:
Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently prone to artefacts due to their image formation process, in which a large number of independent detectors, assumed to yield consistent measurements, are involved. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide a better interpretation of the anatomical and pathological characteristics. However, this is a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on a deep neural network framework in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction of the same size as the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme using residual-driven dropout, determined from the gradient at each layer, is applied. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm.
In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently optimized by a convex optimization algorithm such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders along with their original forms in the training phase. In the testing phase, a given image is first decomposed into its intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001), supervised by the IITP (Institute for Information and Communications Technology Promotion).
Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation
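The Total Variation decomposition step can be sketched as an ROF-style split f = u + v, where u is the intrinsic part and v the nuisance residual. The plain gradient descent below is a simple stand-in for the primal-dual solver the authors use:

```python
import numpy as np

def tv_decompose(f, lam=1.0, eps=1e-3, step=0.1, iters=100):
    """Rough ROF-style decomposition f = u + v: minimise
    TV(u) + (lam/2)||u - f||^2 by gradient descent on a smoothed TV term.
    Illustrative stand-in for a primal-dual solver, not the paper's code."""
    u = f.copy()
    for _ in range(iters):
        ux = np.gradient(u, axis=1)
        uy = np.gradient(u, axis=0)
        mag = np.sqrt(ux**2 + uy**2 + eps)
        # divergence of the normalised gradient field (TV subgradient)
        div = np.gradient(ux / mag, axis=1) + np.gradient(uy / mag, axis=0)
        u = u + step * (div - lam * (u - f))
    return u, f - u  # intrinsic part, nuisance/artefact part

rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0                       # toy "anatomy"
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
u, v = tv_decompose(noisy)
print(u.shape, v.shape)
```

In the paper's setting, u (rather than the raw image) would be fed to the stacked denoising auto-encoders.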
Procedia PDF Downloads 190
3078 Towards Creative Movie Title Generation Using Deep Neural Models
Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie
Abstract:
Deep machine learning techniques, including deep neural networks (DNNs), have been used to model language and dialogue for conversational agents that perform tasks such as giving technical support, and also for general chit-chat. They have been shown to be capable of generating long, diverse, and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby a human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some information, such as the genre of the movie, and learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that the human movie titler may deploy which may not be immediately obvious to the human eye. To give an example of a generated movie title, for the movie synopsis 'A hitman concludes his legacy with one more job, only to discover he may be the one getting hit', the original, true title is 'The Driver' and the one generated by the model is 'The Masquerade'. A human evaluation was conducted in which the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: 'creativity', 'naturalness', and 'suitability'. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means m=3.11 and m=3.12, respectively.
There is room for improvement in these models, as they were rated significantly less 'natural' and 'suitable' than the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results, however, are encouraging given the comparison with a highly considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups who have watched the movie. This process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audience's attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
Keywords: creativity, deep machine learning, natural language generation, movies
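To make the seq2seq data flow concrete, here is a toy, untrained encoder-decoder over an invented vocabulary. With random weights it shows only the shapes and the greedy decoding loop of a description-to-title model, not the authors' trained network:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical toy vocabulary; a real model works over a full word or
# subword vocabulary learned from the corpus.
vocab = ["<s>", "</s>", "the", "driver", "masquerade", "hitman", "job"]
V, H = len(vocab), 16

# Random (untrained) parameters; training would fit these to data.
E = rng.standard_normal((V, H)) * 0.1      # token embeddings
W_enc = rng.standard_normal((H, H)) * 0.1  # encoder recurrence
W_dec = rng.standard_normal((H, H)) * 0.1  # decoder recurrence
W_out = rng.standard_normal((H, V)) * 0.1  # output projection

def encode(tokens):
    """Fold the description tokens into one context vector (toy RNN)."""
    h = np.zeros(H)
    for t in tokens:
        h = np.tanh(W_enc @ h + E[vocab.index(t)])
    return h

def decode(h, max_len=5):
    """Greedily emit title tokens until </s> or max_len."""
    out, tok = [], "<s>"
    for _ in range(max_len):
        h = np.tanh(W_dec @ h + E[vocab.index(tok)])
        tok = vocab[int(np.argmax(h @ W_out))]
        if tok == "</s>":
            break
        out.append(tok)
    return out

title = decode(encode(["hitman", "job"]))
print(title)
```

The output here is meaningless noise; the point is only the encode-then-greedy-decode structure that seq2seq title generation follows.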
Procedia PDF Downloads 326
3077 The Review of Permanent Downhole Monitoring System
Abstract:
With the increasingly difficult development and operating environments of exploration, there are many new challenges in developing and exploiting oil and gas resources. These include the ability to dynamically monitor wells and to provide data and assurance for the completion and production of high-cost and complex wells. A key technology for providing these assurances and maximizing oilfield profitability is real-time permanent reservoir monitoring. Optical fiber sensing systems have gradually begun to replace traditional electronic systems. Traditional temperature sensors can only achieve single-point temperature monitoring, but fiber optic sensing systems based on the fiber Bragg grating principle have a high level of reliability, accuracy, stability, and resolution, enabling cost-effective monitoring in real time, at any time, and without well intervention. Continuous data acquisition is performed along the entire wellbore. An integrated package with a downhole pressure gauge, packer, and surface system can also realize real-time dynamic monitoring of the pressure in sections of the downhole, avoiding oil well intervention and eliminating the production delay and operational risks of conventional surveys. Real-time information obtained through permanent optical fibers can also provide critical reservoir monitoring data for production and recovery optimization.
Keywords: PDHM, optical fiber, coiled tubing, photoelectric composite cable, digital oilfield
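The fiber Bragg grating principle mentioned above reflects light at lambda_B = 2 * n_eff * Lambda, so a temperature change shifts the reflected wavelength. A minimal sketch of the readout arithmetic, using typical textbook coefficients for silica fibre rather than calibration data:

```python
# For temperature alone, the relative Bragg wavelength shift is
# d(lambda)/lambda = (alpha + xi) * dT. The coefficients below are
# typical textbook values for silica fibre, not calibrated sensor data.
ALPHA = 0.55e-6  # thermal expansion coefficient of silica (1/degC)
XI = 6.7e-6      # thermo-optic coefficient (1/degC)

def delta_T(lambda_b_nm, shift_pm):
    """Convert a measured Bragg wavelength shift (pm) at a grating
    centred on lambda_b_nm into a temperature change (degC)."""
    shift_nm = shift_pm * 1e-3
    return shift_nm / (lambda_b_nm * (ALPHA + XI))

# A 1550 nm grating shifting by ~11 pm corresponds to roughly +1 degC.
print(round(delta_T(1550.0, 11.2), 2))
```

Pressure sensing works analogously through strain-induced wavelength shifts, which is why one interrogator can read many multiplexed gratings along the wellbore.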
Procedia PDF Downloads 79
3076 Between the House and the City: An Investigation of the Structure of the Family/Society and the Role of the Public Housing in Tokyo and Berlin
Authors: Abudjana Babiker
Abstract:
The middle of the twentieth century witnessed an explosion in public housing. After the Great Depression, both capitalist and communist countries launched policies and programs to produce public housing in urban areas. Concurrently, modernism, the leading architectural style at the time, strongly supported this production and was the principal instrument of the public housing program's success, owing to the modernist manifesto of a manufactured architecture, an international style that serves society and connects it to the other design industries, which allowed for the mass production of architectural elements. After the Second World War, public housing flourished, especially in communist countries. Public housing was conceived at the time as living space, while the workplace served as the place for production and labor. At the end of the twentieth century, Michel Foucault's introduction of biopolitics highlighted the alteration of the relationship between production and labor. The house no longer functions strictly as the family's sanctuary from production; it opens the house, as part of the city, to be a space for production, not only to produce objects but to reproduce the family as an integral part of the production mechanism in the city. While public housing kept changing from one country to another after the failure of modernist public housing in the late 1970s, society continued changing in parallel with the socio-economic conditions of each political-economic system, and public housing thus followed. The family structure in major cities has been changing dramatically; single parenting and long working hours, for instance, have been escalating loneliness in major cities such as London, Berlin, and Tokyo, and family-oriented public housing no longer suits the single lifestyle of individuals.
This paper investigates the performance of both the single/individual lifestyle and the family/society structure in Tokyo and Berlin in relation to the utilization of public housing under the economic policies and socio-political environment that produced the individual and the collective. The investigation is carried out through a study of the underlying individual/society structure and through case studies examining the utilization of housing. The major finding is that the individual and the collective revolve around the city; the city acts as a system that magnetizes and blurs the line between production and reproduction lifestyles. Mass public housing for families is shifting toward a combination of neo-liberal and socialist housing.
Keywords: loneliness, production and reproduction, work-live, public housing
Procedia PDF Downloads 185
3075 Gesture-Controlled Interface Using Computer Vision and Python
Authors: Vedant Vardhan Rathour, Anant Agrawal
Abstract:
The project aims to provide a touchless, intuitive interface for human-computer interaction, enabling users to control their computer using hand gestures and voice commands. The system leverages advanced computer vision techniques, using the MediaPipe framework and OpenCV to detect and interpret real-time hand gestures, transforming them into mouse actions such as clicking, dragging, and scrolling. Additionally, the integration of a voice assistant powered by the SpeechRecognition library allows for seamless execution of tasks like web searches, location navigation, and gesture control on the system through voice commands.
Keywords: gesture recognition, hand tracking, machine learning, convolutional neural networks
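As a sketch of how such gesture logic can work: MediaPipe Hands reports 21 normalised landmarks per hand (index 4 is the thumb tip, 8 the index fingertip), and a "pinch" can be detected from the distance between the two tips. The threshold and screen size below are illustrative assumptions, not values from the project:

```python
import math

# MediaPipe Hands returns 21 landmarks with normalised (x, y) in [0, 1].
THUMB_TIP, INDEX_TIP = 4, 8
PINCH_THRESHOLD = 0.05  # illustrative; tuned in practice

def to_screen(lm, width=1920, height=1080):
    """Map a normalised landmark to pixel coordinates for cursor control."""
    return int(lm[0] * width), int(lm[1] * height)

def is_pinch(landmarks):
    """Treat thumb tip and index fingertip nearly touching as a 'click'."""
    (x1, y1), (x2, y2) = landmarks[THUMB_TIP], landmarks[INDEX_TIP]
    return math.hypot(x1 - x2, y1 - y2) < PINCH_THRESHOLD

# Fake frame: 21 landmarks with thumb and index tips nearly touching.
lms = [(0.5, 0.5)] * 21
lms[THUMB_TIP], lms[INDEX_TIP] = (0.40, 0.40), (0.42, 0.41)
print(is_pinch(lms), to_screen(lms[INDEX_TIP]))
```

In the full system, `is_pinch` would gate a mouse-down event (e.g. via an automation library) on every camera frame processed by MediaPipe.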
Procedia PDF Downloads 12
3074 Leveraging Play to Foster Healthy Social-emotional Development in Young Children in Poverty
Authors: Smita Mathur
Abstract:
Play is an innate, player-centric, joyful, fundamental activity of early childhood development that significantly contributes to social, emotional, and academic learning. Leveraging the power of play can enhance these domains by creating engaging, interactive, and developmentally appropriate learning experiences for young children. This research aimed to systematically examine young children's play behaviors with a focus on four primary objectives: (1) the frequency and duration of on-task behaviors, (2) social interactions and emotional expressions during play, (3) the correlation between academic skills and play, and (4) identifying best practices for integrating play-based curricula. To achieve these objectives, a mixed-method study was conducted among preschool-aged children from low socio-economic populations in the United States. The children, identified using purposive sampling, were observed during structured play in classrooms and unstructured play during outdoor playtime and in their home environments. The study sampled 97 preschool-aged children, and a total of 3,970 minutes of observations were coded to address the research questions. Thirty-seven percent of the children lived in linguistically isolated families, and 76% lived in basic-budget poverty. Many children lived in overcrowded housing situations (67%), and most families had mixed citizenship status (66%). The observational study was conducted using the observation protocol from the Oxford Study Project. On-task behaviors were measured by tracking the frequency and duration of activities in which children maintained focus and engagement. In examining social interactions and emotional expressions, the study recorded social interactions, emotional responses, and teacher involvement during play. The study aimed to identify best practices for integrating play-based curricula into early childhood education.
By analyzing the effectiveness of different play-based strategies and their impact on on-task behaviors, social-emotional development, and academic skills, the research sought to provide actionable recommendations for educators and caregivers. The findings from the study (1) highlight play behaviors that increase on-task behaviors and academic and social skills in young children, (2) offer insights into teacher preparation and the design of play-based curricula, and (3) critique observation as a data collection technique.
Keywords: play, early childhood education, social-emotional development, academic development
Procedia PDF Downloads 28
3073 Predicting Personality and Psychological Distress Using Natural Language Processing
Authors: Jihee Jang, Seowon Yoon, Gaeun Son, Minjung Kang, Joon Yeon Choeh, Kee-Hong Choi
Abstract:
Background: Self-report multiple-choice questionnaires have been widely utilized to quantitatively measure one's personality and psychological constructs. Despite several strengths (e.g., brevity and utility), self-report multiple-choice questionnaires have considerable limitations in nature. With the rise of machine learning (ML) and natural language processing (NLP), researchers in the field of psychology are widely adopting NLP to assess psychological constructs and predict human behaviors. However, there is a lack of connection between the work being performed in computer science and that in psychology, due to small data sets and unvalidated modeling practices. Aims: The current article introduces the study method and procedure of phase 2, which includes the interview questions for the five-factor model (FFM) of personality developed in phase 1. This study aims to develop the interview (semi-structured) and open-ended questions for the FFM-based personality assessments, designed with experts in the field of clinical and personality psychology (phase 1), and to collect personality-related text data using the interview questions and self-report measures of personality and psychological distress (phase 2). The purpose of the study includes examining the relationship between the natural language data obtained from the interview questions measuring the FFM personality constructs and psychological distress, to demonstrate the validity of natural language-based personality prediction. Methods: The phase 1 (pilot) study was conducted on fifty-nine native Korean adults to acquire personality-related text data from the interview (semi-structured) and open-ended questions based on the FFM of personality. The interview questions were revised and finalized with feedback from an external expert committee consisting of personality and clinical psychologists.
Based on the established interview questions, a total of 425 Korean adults were recruited using a convenience sampling method via an online survey. The text data collected from the interviews were analyzed using natural language processing. The results of the online survey, including demographic data and depression, anxiety, and personality inventories, were analyzed together in the model to predict individuals' FFM of personality and level of psychological distress (phase 2).
Keywords: personality prediction, psychological distress prediction, natural language processing, machine learning, the five-factor model of personality
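A minimal sketch of what a text-based trait predictor ultimately computes: a linear score over word features. The lexicon and its weights below are invented for illustration; the study's models are learned from the interview data, not hand-set:

```python
# Hypothetical lexicon weights for one five-factor trait (extraversion);
# a trained NLP model would learn such weights from labelled text.
EXTRAVERSION_WEIGHTS = {"party": 0.8, "friends": 0.5,
                        "alone": -0.7, "quiet": -0.4}

def trait_score(text, weights, bias=0.0):
    """Score a text by summing the weights of the words it contains
    (a bag-of-words linear model, the simplest possible predictor)."""
    tokens = text.lower().split()
    return bias + sum(weights.get(t, 0.0) for t in tokens)

a = trait_score("I love meeting friends at a party", EXTRAVERSION_WEIGHTS)
b = trait_score("I prefer being alone in a quiet room", EXTRAVERSION_WEIGHTS)
print(a > b)
```

Modern pipelines replace the word counts with learned embeddings, but the validation question is the same: do the text-derived scores track the self-report FFM and distress measures?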
Procedia PDF Downloads 79
3072 Technological Advancement in Fashion Online Retailing: A Comparative Study of Pakistan and UK Fashion E-Commerce
Authors: Sadia Idrees, Gianpaolo Vignali, Simeon Gill
Abstract:
The study aims to establish how virtual size and fit technology features can enhance fashion online retailing platforms, utilising digital human measurements to provide customised style and function to consumers. A few firms in the UK have launched advanced interactive fashion shopping domains for personalised shopping globally, aided by the latest internet technology. Virtual size and fit interfaces have great potential to provide a personalised, better-fitted garment and to promote mass customisation globally. Made-to-measure clothing, made from unstitched fabric, is a common offering of fashion brands in Pakistan and is regarded as economical and sustainable by consumers there. Although a manual sizing system is used to sell garments online, virtual size and fit visualisation and recommendation technologies are uncommon in Pakistani fashion interfaces. A comparative assessment of Pakistani fashion brand websites and UK technology-driven fashion interfaces was conducted to highlight the vast potential of virtual size and fit technology. The results indicated that the web 2.0 technology adopted by Pakistani apparel brands has limited features, whereas companies practising web 3.0 technology provide an interactive online real-store shopping experience, leading to enhanced customer satisfaction and the globalisation of brands.
Keywords: e-commerce, mass customization, virtual size and fit, web 3.0 technology
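A virtual size recommendation step can be as simple as a nearest-neighbour match of the shopper's digital measurements against a size chart; the chart values below are invented for illustration, not any brand's sizing data:

```python
# Hypothetical size chart: (bust_cm, waist_cm, hip_cm) per size label.
SIZE_CHART = {
    "S": (84, 66, 90),
    "M": (90, 72, 96),
    "L": (96, 78, 102),
}

def recommend_size(bust, waist, hip):
    """Return the size whose chart entry is closest (squared distance)
    to the shopper's measurements - a toy nearest-neighbour match."""
    def distance(spec):
        return sum((a - b) ** 2 for a, b in zip((bust, waist, hip), spec))
    return min(SIZE_CHART, key=lambda s: distance(SIZE_CHART[s]))

print(recommend_size(91, 73, 95))
```

Commercial size-and-fit engines add garment ease, fit preference, and purchase-history signals on top of this basic measurement match.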
Procedia PDF Downloads 141
3071 Developing Cultural Competence as Part of Nursing Studies: Language, Customs and Health Issues
Authors: Mohammad Khatib, Salam Hadid
Abstract:
Introduction: Developing nurses' cultural competence begins in their basic training and requires them to participate in an array of activities which raise their awareness and stimulate their interest, desire, and curiosity about different cultures by creating opportunities for intercultural meetings promoting the concept of 'culture' and its components, including recognition of cultural diversity and the legitimacy of the other. Importantly, professionals need to acquire specific cultural knowledge and a thorough understanding of the values, norms, customs, beliefs, and symbols of different cultures. Similarly, they need to be given opportunities to practice the verbal and non-verbal communication skills of other cultures according to their cultural codes. Such a system is being implemented as part of nursing studies at Zefat Academic College in two study frameworks: firstly, a course integrating nursing theory and practice in multicultural nursing; secondly, a course in learning the languages spoken in Israel, focusing on medical and nursing terminology. Methods: Students participating in the 'Transcultural Nursing' course come from a variety of backgrounds: Jewish or Arab, religious or secular, Muslim, Christian, new immigrants, Ethiopians, or of other cultural affiliations. They are required to present and discuss cultural practices that affect health. In addition, as part of the language course, students learn, and teach their peers, five languages spoken in Israel (Arabic, Russian, Amharic, Yiddish, and sign language), focusing on therapeutic interaction and communication using the vocabulary and concepts necessary for the therapeutic encounter. An evaluation of the process and the results was done using a structured questionnaire which includes a series of questions on the contributions of the courses to the students' cultural knowledge, awareness, and skills. 155 students completed the questionnaire.
Results: A preliminary assessment of this educational system points to an increase in cultural awareness and knowledge among the students, as well as in their willingness to recognize the other's difference. A positive atmosphere of multiculturalism, reflected in the students' mutual interest and respect, was created. Students showed a deep understanding of cultural issues relating to health and care (consanguinity and genetics, food customs, cultural events, reincarnation, traditional treatments, etc.). Most of the students were willing to recommend the courses to others and suggested some changes to the learning methods (more simulations, role playing, and activities).
Keywords: cultural competence, nursing education, culture, language
Procedia PDF Downloads 277
3070 Neuro-Epigenetic Changes in Diabetes-Induced Synaptic Fidelity in the Brain
Authors: Valencia Fernandes, Dharmendra Kumar Khatri, Shashi Bala Singh
Abstract:
Background and Aim: Epigenetic marks are the silent signatures of several pathological processes in the brain. This study examines the influence of DNA methylation, a major epigenetic modification, in the prefrontal cortex and hippocampus of the diabetic brain and its notable effect on the cellular chaperones and synaptic proteins. Method: Chronic high-fat diet and STZ-induced diabetic mice were studied for cognitive dysfunction, and global DNA methylation, as well as DNA methyltransferase (DNMT) activity, were assessed. Further, the cellular chaperones and synaptic proteins were examined using the DNMT inhibitor 5-aza-2′-deoxycytidine (5-aza-dC), delivered via intracerebroventricular injection. Moreover, the percentage methylation of these synaptic proteins was also studied so as to correlate their epigenetic involvement. Computationally, their interaction with the DNMT enzyme was also studied using bioinformatic tools. Histological studies for morphological alterations and neuronal degeneration were also performed. Neurogenesis, a characteristic marker for new learning and memory formation, was also assessed via BrdU staining. Finally, the most important behavioral studies, including the Morris water maze, Y maze, passive avoidance, and novel object recognition tests, were performed to assess cognitive function. Results: Altered global DNA methylation and increased levels of DNMTs within the nucleus were confirmed in the cortex and hippocampus of the diseased mice, suggesting hypermethylation at a genetic level. Treatment with 5-aza-dC, a global DNA demethylating agent, ameliorated the protein and gene expression of the cellular chaperones and synaptic fidelity. Furthermore, the methylation analysis profile showed hypermethylation of the hsf1 protein, a master regulator for chaperones, and thus confirmed the epigenetic involvement in the diseased brain.
Morphological improvements and decreased neurodegeneration, along with enhanced neurogenesis in the treatment group, suggest that epigenetic modulations do participate in learning and memory. This is supported by the improved performance on the behavioral test battery seen in the treatment group. Conclusion: DNA methylation could possibly contribute to dysregulating the memory-associated proteins at chronic stages of type 2 diabetes. This suggests a substantial contribution to the underlying pathophysiology of several metabolic syndromes, such as insulin resistance and obesity, and to the transition of this damage to central effects such as cognitive dysfunction.
Keywords: epigenetics, cognition, chaperones, DNA methylation
Procedia PDF Downloads 205
3069 The Impact of Undisturbed Flow Speed on the Correlation of Aerodynamic Coefficients as a Function of the Angle of Attack for the Gyroplane Body
Authors: Zbigniew Czyz, Krzysztof Skiba, Miroslaw Wendeker
Abstract:
This paper discusses the results of an aerodynamic investigation of the Tajfun gyroplane body designed by a Polish company, Aviation Artur Trendak. This gyroplane has been studied as a 1:8 scale model. Scaling objects for aerodynamic investigation is an inherent procedure in any kind of designing. When scaling, the criteria of similarity need to be satisfied. The basic criteria of similarity are geometric, kinematic and dynamic. Although the results of aerodynamic research are often reduced to aerodynamic coefficients, one should pay attention to how the values of these coefficients behave if certain criteria are to be satisfied. To satisfy the dynamic criterion, for example, the Reynolds number should be focused on. This is the ratio of inertial to viscous forces. Since its numerator is the flow speed multiplied by the specific dimension (with a constant kinematic viscosity coefficient), the flow speed in wind tunnel research should be increased by the same factor as the object is scaled down. The aerodynamic coefficients specified in this research depend on the real forces that act on an object, its specific dimension, medium speed and variations in its density. Rapid prototyping with a 3D printer was applied to create the research object. The research was performed with a T-1 low-speed wind tunnel (the diameter of its measurement volume is 1.5 m) and a six-component internal aerodynamic balance, WDP1, at the Institute of Aviation in Warsaw. The T-1 is a continuous-operation low-speed wind tunnel with an open measurement space. The research covered a number of selected speeds of undisturbed flow, i.e. V = 20, 30 and 40 m/s, corresponding to the Reynolds numbers (as referred to 1 m) Re = 1.31∙10⁶, 1.96∙10⁶, 2.62∙10⁶, for angles of attack ranging over -15° ≤ α ≤ 20°. Our research resulted in basic aerodynamic characteristics and in observing the impact of undisturbed flow speed on the correlation of aerodynamic coefficients as a function of the angle of attack of the gyroplane body.
If the speed of undisturbed flow in the wind tunnel changes, the aerodynamic coefficients are significantly impacted. As the speed increases from 20 m/s to 30 m/s, the drag coefficient, Cx, changes by 2.4% up to 9.9%, whereas the lift coefficient, Cz, changes by -25.5% up to 15.7% if the angle of attack of 0° is excluded, or by -25.5% up to 236.9% if the angle of attack of 0° is included. Within the same speed range, the pitching moment coefficient, Cmy, changes by -21.1% up to 7.3% if the angles of attack -15° and -10° are excluded, or by -142.8% up to 618.4% if the angles of attack -15° and -10° are included. These discrepancies in the coefficients of aerodynamic forces definitely need to be considered while designing the aircraft. For example, if the load on certain aircraft surfaces is calculated, additional correction factors definitely need to be applied. This study allows us to estimate the discrepancies in the aerodynamic forces while scaling the aircraft. This work has been financed by the Polish Ministry of Science and Higher Education.
Keywords: aerodynamics, criteria of similarity, gyroplane, research tunnel
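As a quick cross-check of the Reynolds numbers quoted above, a minimal sketch can be written; the kinematic viscosity of air used here (≈1.527e-5 m²/s, roughly room-temperature air) is an assumed value back-fitted from the quoted figures, since the authors do not state it:

```python
def reynolds_number(speed_m_s, length_m=1.0, nu=1.527e-5):
    """Re = V * L / nu, the ratio of inertial to viscous forces.

    nu is an assumed kinematic viscosity of air (m^2/s), chosen so that the
    quoted figures Re = 1.31e6, 1.96e6, 2.62e6 (referred to L = 1 m) are
    reproduced; it is not a value stated in the abstract.
    """
    return speed_m_s * length_m / nu

for v in (20, 30, 40):
    print(f"V = {v} m/s -> Re = {reynolds_number(v) / 1e6:.2f} million")
```

The sketch also makes the scaling argument concrete: halving the model's characteristic dimension requires doubling the flow speed to keep Re constant.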
Procedia PDF Downloads 393
3068 Peers' Alterity in Inverted Inclusion: A Case Study
Authors: Johanna Sagner, María José Sandoval
Abstract:
At the early stages of adolescence, young people, regardless of whether they have a disability or not, start to establish closer friendship ties. Unlike in previous developmental phases, these ties are rather reciprocal, more committed, and require more time. Friendship ties during adolescence allow the development of social and personal skills, specifically the skills to start constructing identity. In an inclusive context that incorporates young people with a disability, friendship among peers also takes place. Nonetheless, the relation is shaped, among other things, by the alterity construction of the other with a disability. Research about the peer relation between young people with and without disability in an inclusive context has shown that the relation tends to become a helper-helpee relation, where those with a disability are seen as people in need. Prejudices about the other's condition, or distancing from the other because of his/her disability, are common. In this sense, the helper-helpee relation, as a non-reciprocal and protective relation, will not promote friendship between classmates, but rather an asymmetric alterity. Our research is an explorative case study that aims to understand how the relation between peers is shaped within a different kind of inclusive program, where the integrated group also has special educational needs. Therefore, we analyze the data of an inverted inclusive program from a qualitative and a quantitative approach. This is a unique case of a special public school for visual disability in Germany that includes young people from a mainstream school who had learning difficulties. For the research, we analyze data from interviews, focal interviews and open-ended questions with an interpretative phenomenological analysis approach. The questionnaires include a five-point Likert scale, for which we calculate the acceptance rate. The findings show that the alterity relation between pupils is less asymmetrical and represents a rather horizontal alterity.
The helper-helpee relation is marked by exchange: since both groups have special educational needs, those with a visual disability and those with learning difficulties help each other indistinctly. Friendship is more present among classmates. The horizontal alterity of the peer relation is influenced by a sort of tie in which neither group needs more or less help than the other. Both groups identify that they themselves and the other have special needs. The axiological axis of alterity is not one of superiority or inferiority; each recognizes the other's differences and otherness. Another influential factor relates to the amount of time they spend together, since the program does not have a resource room or a teacher who teaches parallel lessons. Two probable causes for this rather equal peer relation might be the constellation of fewer pupils per classroom and the differentiated lessons taught by teachers with special educational training.
Keywords: alterity, disability, inverted inclusion, peers' relation
Procedia PDF Downloads 315
3067 Effect on the Integrity of the DN300 Pipe and Valves in the Cooling Water System Imposed by the Pipes and Ventilation Pipes above in an Earthquake Situation
Authors: Liang Zhang, Gang Xu, Yue Wang, Chen Li, Shao Chong Zhou
Abstract:
Presently, more and more nuclear power plants are facing the issue of life extension. When a nuclear power plant applies for an extension of life, its condition needs to meet the current design standards, which is not the case for all old reactors, typically regarding seismic design. Seismic-grade equipment in nuclear power plants is now generally placed separately from non-seismic-grade equipment, but this was not strictly required before. Therefore, it is very important to study whether non-seismic-grade equipment will affect seismic-grade equipment when it drops down in an earthquake, which is related to the safety of nuclear power plants and future life extension applications. This research was based on a cooling water system with seismic-grade and non-seismic-grade equipment installed together, as an example to study whether non-seismic-grade equipment such as DN50 fire pipes and ventilation pipes arranged above will damage the DN300 pipes and valves arranged below when earthquakes occur. In the study, the simulation was carried out in ANSYS/LS-DYNA, with Johnson-Cook used as the material model and failure model. For the experiments, the relative positions of the objects in the room were reproduced at a 1:1 scale. In the experiment, the pipes and valves were filled with water at a pressure of 0.785 MPa. The pressure-holding performance of the pipe was used as a criterion for damage. In addition to the pressure-holding performance, the opening torque was considered as well for the valves. The research results show that when the 10-meter-long DN50 pipe is dropped from a height of 8 meters and the 8-meter-long air pipe from a height of 3.6 meters, they do not affect the integrity of the DN300 pipe below. There is no failure phenomenon in the simulation either. After the experiment, the pressure drop over two hours for the pipe is less than 0.1%. The main body of the valve does not fail either.
The change in opening torque after the experiment is less than 0.5%, but the handwheel of the valve may break, which affects the opening actions. In summary, the impacts of upper pipes and ventilation pipes dropping down on the integrity of the DN300 pipes and valves below, in a cooling water system of a typical second-generation nuclear power plant under an earthquake, were studied. As a result, the functionality of the DN300 pipeline and the valves themselves is not significantly affected, but the handwheel of the valve or similar articles can probably be broken and needs attention.
Keywords: cooling water system, earthquake, integrity, pipe and valve
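The abstract names the Johnson-Cook material model; its flow-stress relation can be sketched as below. The constants used here are illustrative steel-like values, not the parameters used in the study:

```python
import math

def johnson_cook_flow_stress(strain, strain_rate, temp,
                             A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                             ref_strain_rate=1.0, t_room=293.0, t_melt=1700.0):
    """Johnson-Cook flow stress in Pa:

        sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot_0)) * (1 - T*^m)

    with T* = (T - T_room) / (T_melt - T_room). A, B, n, C, m are material
    constants; the defaults are illustrative steel-like values (assumed, not
    from the study). strain_rate must be positive (log term).
    """
    t_star = (temp - t_room) / (t_melt - t_room)
    return ((A + B * strain ** n)
            * (1.0 + C * math.log(strain_rate / ref_strain_rate))
            * (1.0 - t_star ** m))

# At zero plastic strain, the reference strain rate and room temperature,
# the flow stress reduces to the yield constant A.
print(johnson_cook_flow_stress(0.0, 1.0, 293.0))
```

The three bracketed factors capture strain hardening, strain-rate hardening, and thermal softening respectively, which is why the model suits a drop-impact simulation of this kind.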
Procedia PDF Downloads 112
3066 Development of an Intelligent Decision Support System for Smart Viticulture
Authors: C. M. Balaceanu, G. Suciu, C. S. Bosoc, O. Orza, C. Fernandez, Z. Viniczay
Abstract:
The Internet of Things (IoT) represents the best option for smart vineyard applications, even if it is necessary to integrate the technologies required for the development. This article is based on the research and the results obtained in the DISAVIT project. For smart agriculture, the project aims to provide a trustworthy, intelligent, integrated vineyard management solution that is based on the IoT. To achieve interoperability through the use of a multiprotocol technology (the future of connected wireless IoT), it is necessary to adopt an agnostic approach, providing a reliable environment that addresses cyber security and IoT-based threats, offers traceability through a blockchain-based design, and also supports long-term implementations (modular, scalable). These represent the main innovative technical aspects of this project. The DISAVIT project studies and promotes the incorporation of better management tools based on objective, data-based decisions, which are necessary for an agriculture adapted to and more resistant to climate change. It also exploits the opportunities generated by the digital services market for smart agriculture management stakeholders. The project's final result aims to improve decision-making, performance, and viticultural infrastructure and to increase real-time data accuracy and interoperability. Innovative aspects such as end-to-end solutions, adaptability, scalability, security and traceability place our product in a favorable situation over competitors. No solution currently on the market meets all of these requirements in a single product, which is what makes ours innovative.
Keywords: blockchain, IoT, smart agriculture, vineyard
Procedia PDF Downloads 202
3065 The Outcome of Using Machine Learning in Medical Imaging
Authors: Adel Edwar Waheeb Louka
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to their low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. The model is trained on 8577 images and validated on a validation split of 20%. These models are evaluated using the external dataset for validation. The models' accuracy, precision, recall, F1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays.
The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
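The evaluation metrics reported above can be illustrated with a minimal sketch; the accuracy and IoU here are computed on toy binary masks, not the study's data or its actual evaluation code:

```python
def pixel_accuracy(pred, true):
    """Fraction of pixels where the predicted mask matches the ground truth."""
    matches = sum(p == t for p, t in zip(pred, true))
    return matches / len(true)

def iou(pred, true):
    """Intersection over Union for binary masks (flattened to 0/1 lists):
    |pred AND true| / |pred OR true|. Empty union is scored as 1.0."""
    intersection = sum(1 for p, t in zip(pred, true) if p and t)
    union = sum(1 for p, t in zip(pred, true) if p or t)
    return intersection / union if union else 1.0

# toy 8-pixel masks, purely illustrative
pred = [1, 1, 1, 0, 0, 0, 1, 0]
true = [1, 1, 0, 0, 0, 1, 1, 0]
print(f"IoU = {iou(pred, true):.3f}, accuracy = {pixel_accuracy(pred, true):.3f}")
```

IoU is the stricter metric for segmentation: background pixels inflate pixel accuracy, while IoU only credits overlap of the foreground regions.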
Procedia PDF Downloads 73
3064 Students with Severe Learning Disabilities in Mainstream Classes: A Study of Comprehensions amongst School Staff and Parents Built on Observations and Interviews in a Phenomenological Framework
Authors: Inger Eriksson, Lisbeth Ohlsson, Jeremias Rosenqvist
Abstract:
Introduction: Focus in the study is directed towards the phenomena and concepts of segregation, integration, and inclusion of students attending a special school form in Sweden, namely compulsory school for pupils with learning disabilities (in Swedish 'särskola'), as an alternative to mainstream compulsory school. Aim: The aim of the study is to examine the school situation for students attending särskola from a historical perspective focussing on the 1980s, 1990s and the 21st century, from an integration perspective, and from a perspective of power. Procedure: Five sub-studies are reported, in which integration and inclusion are looked into through observation studies and interviews with school leaders, teachers, special and remedial teachers, psychologists, coordinators, and parents in the special schools/särskola. In brief, the study of special school students attending mainstream classes from 1998 takes its point of departure in the idea that all knowledge development takes place in a social context. A special interest is taken in the school's role in integration generally, and the role of special education particularly, and in whose conditions the integration is taking place: the special school students', the other students', or maybe equally those of the whole class. Pedagogical and social conditions for so-called individually integrated special school students in elementary school classes were studied in eleven classes. Results: The findings are interpreted in a power perspective supported by Foucault and relationally by Vygotsky. The main part of the data consists of extensive descriptions of the eleven cases, here called integration situations. Conclusions: In summary, this study suggests that the possibilities for a special school student to get into the class community and fellowship, and thereby be integrated with the class, are to a high degree dependent on the extent to which the student can take part in the pedagogical processes.
The pedagogical situation for the special school student is affected not only by the class teacher and the support and measures undertaken, but also by the other students in the class, as they, in turn, are affected by how the special school student acts. This mutual impact, which constitutes the integration process itself, might result in true integration if the special school student attains the status of being accepted on his/her own terms, not merely being cared for or cherished by some classmates. A special school student who is not accepted, even on the terms of the class, will often experience severe problems in contacts with classmates, and the school situation might thus be a mere placement.
Keywords: integration/inclusion, mainstream school, power, special school students
Procedia PDF Downloads 248
3063 Comparison of the Cyclic Fatigue Resistance of Endoart Gold, Endoart Blue, Protaper Universal, and Protaper Gold Files at Body Temperature
Authors: Ayhan Eymirli, Sila N. Usta
Abstract:
The aim of this study is to compare the cyclic fatigue resistance of EndoArt Gold (EAG, Inci Dental, Istanbul, Turkey), EndoArt Blue (EAB, Inci Dental, Istanbul, Turkey), ProTaper Universal (PTU, Dentsply Tulsa Dental Specialties), and ProTaper Gold (PTG, Dentsply Tulsa Dental Specialties) files at body temperature. Twelve instruments of each of the EAG, EAB, PTU, and PTG file systems were included in this study. All selected files were rotated in artificial canals with a 60° angle and a 5-mm radius of curvature until fracture occurred. The time to fracture (Ttf) was measured in seconds with a chronometer on the control panel of the cyclic fatigue testing device when a fracture was detected visually and/or audibly. The lengths of the fractured fragments (FL) were also measured with a digital microcaliper. The Ttf and FL data were analyzed using Kruskal-Wallis, one-way ANOVA and post hoc Bonferroni tests at the 5% significance level. There was a statistically significant difference among the file systems (p < 0.05). EAB had the statistically highest fatigue resistance, and PTU had the statistically lowest fatigue resistance (p < 0.05). The PTG system had statistically higher FL means than the EAB and PTU file systems (p < 0.05). EAB had the greatest cyclic fatigue resistance among the file systems tested. It can be stated that heat treatment may be a factor that increases fatigue resistance.
Keywords: cyclic fatigue resistance, EndoArt Blue, EndoArt Gold, ProTaper Gold, ProTaper Universal
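The Kruskal-Wallis test applied to the Ttf and FL data can be sketched from first principles; this is the textbook H statistic (average ranks for ties, no further tie correction), shown for illustration rather than the statistical software the authors used:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples:

        H = 12 / (N*(N+1)) * sum(R_i^2 / n_i) - 3*(N+1)

    where N is the total sample size, n_i the group sizes, and R_i the sum
    of pooled ranks in group i (ties get their average rank)."""
    pooled = sorted((value, gi) for gi, g in enumerate(groups) for value in g)
    n_total = len(pooled)
    # assign average ranks (1-based) to runs of tied values
    ranks = [0.0] * n_total
    i = 0
    while i < n_total:
        j = i
        while j + 1 < n_total and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg_rank = (i + j + 2) / 2.0
        for k in range(i, j + 1):
            ranks[k] = avg_rank
        i = j + 1
    rank_sums = [0.0] * len(groups)
    for (_, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return (12.0 / (n_total * (n_total + 1))
            * sum(rs ** 2 / len(g) for rs, g in zip(rank_sums, groups))
            - 3.0 * (n_total + 1))

# two hypothetical, clearly separated samples give a large H
print(kruskal_wallis_h([1, 2, 3], [10, 11, 12]))
```

In practice `scipy.stats.kruskal` would be used; the sketch only shows why the test is rank-based and therefore suited to fatigue-life data that need not be normally distributed.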
Procedia PDF Downloads 126
3062 Educational Innovation and ICT: Before and during 21st Century
Authors: Carlos Monge López, Patricia Gómez Hernández
Abstract:
Educational innovation is a quality factor in teaching-learning processes and institutional accreditation. These change processes have been increasing, especially after 2000. However, the publications about this topic are more associated with ICTs in the current century. The main aim of the study was to determine the tendency of educational innovations around ICTs. The method used was a mixed research design (content analysis, review of scientific literature, and a descriptive, comparative and correlational study) with 649 papers. In summary, the results indicated that, progressively, educational innovation is becoming associated with ICTs, in comparison with change processes of this type without ICTs. In conclusion, despite this tendency, the scientific literature should disseminate more kinds of pedagogical innovation, with the aim of deepening the use of other new resources.
Keywords: descriptive study, knowledge society, pedagogical innovation, technologies
Procedia PDF Downloads 485
3061 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm
Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin
Abstract:
Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created by using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. The disadvantage of this technique is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, and so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with a known amplitude distribution, which is often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object, and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by using this method.
Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform
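The GS iteration described above can be sketched in a few lines. This is a minimal one-dimensional version with a naive DFT, intended only to show the alternating amplitude and phase constraints between the hologram and Fourier planes; it is not the authors' implementation, and a real 2-D kinoform would use an FFT:

```python
import cmath
import math

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform, kept stdlib-only for the sketch."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[t] * cmath.exp(sign * 2j * math.pi * k * t / n) for t in range(n))
           for k in range(n)]
    return [v / n for v in out] if inverse else out

def gerchberg_saxton(target_amplitude, iterations=40):
    """Return a phase-only (unit-amplitude) kinoform whose Fourier transform
    approximates target_amplitude in magnitude."""
    # start in the image (Fourier) plane: target amplitude, zero phase
    field = [complex(a, 0.0) for a in target_amplitude]
    kinoform = field
    for _ in range(iterations):
        # back-propagate and impose the phase-only (kinoform) constraint
        holo = dft(field, inverse=True)
        kinoform = [cmath.exp(1j * cmath.phase(v)) for v in holo]
        # forward-propagate and impose the target amplitude, keeping the phase
        image = dft(kinoform)
        field = [a * cmath.exp(1j * cmath.phase(v))
                 for a, v in zip(target_amplitude, image)]
    return kinoform

def reconstruction_error(kinoform, target_amplitude):
    """Normalized L2 mismatch between |DFT(kinoform)| and the target amplitude."""
    recon = [abs(v) for v in dft(kinoform)]
    s_r = math.sqrt(sum(v * v for v in recon)) or 1.0
    s_t = math.sqrt(sum(a * a for a in target_amplitude)) or 1.0
    return sum((r / s_r - a / s_t) ** 2 for r, a in zip(recon, target_amplitude))

target = [0.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 0.0]
kinoform = gerchberg_saxton(target)
print(f"error after 40 iterations: {reconstruction_error(kinoform, target):.4f}")
```

Because the kinoform constraint only keeps the phase, the error cannot reach zero in general; the iteration trades off amplitude fidelity in one plane against the phase-only restriction in the other, which is exactly the noise trade-off the abstract describes.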
Procedia PDF Downloads 533
3060 Bacterial Recovery of Copper Ores
Authors: Zh. Karaulova, D. Baizhigitov
Abstract:
At the Aktogay deposit, the oxidized ore section has been developed since 2015; by now, the reserves of easily enriched ore are decreasing, and a large amount of copper-poor, difficult-to-enrich ore has accumulated in the dumps of the KAZ Minerals Aktogay deposit, which is unprofitable to mine using traditional mining methods. Hence, another technology needs to be implemented, which will significantly expand the raw material base of copper production in Kazakhstan and ensure the efficient use of natural resources. Heap and dump bacterial recovery are the most acceptable technologies for processing low-grade secondary copper sulfide ores. The test objects were the copper ores of the Aktogay deposit and the chemolithotrophic bacteria Leptospirillum ferrooxidans (L.f.), Acidithiobacillus caldus (A.c.), and Sulfobacillus acidophilus (S.a.), which were used as mixed cultures in bacterial oxidation systems. They can stay active in the 20-40°C temperature range. These bacteria are the most extensively studied and widely used in sulfide mineral recovery technology. Biocatalytic acceleration was achieved as a result of the bacteria oxidizing iron sulfides to form ferrous sulfate, which subsequently underwent oxidation to the ferric form. The following results have been achieved at the initial stage: the goal was to grow and maintain the activity of the bacterial cultures under laboratory conditions. These bacteria grew best within the pH 1.2-1.8 range with light stirring and in an aerated environment. The optimal growth temperature was 30-33°C. The growth rate decreased by one-half for each 4-5°C fall in temperature from 30°C. At best, the number of bacteria doubled every 24 hours. Typically, the maximum concentration of cells that can be grown in ferrous solution is about 10⁷/ml. A further step researched in this case was the adaptation of the microorganisms to the environment of certain metals.
This was followed by mass production of the inoculum and its maintenance for further cultivation on a factory scale. This was done by adding sulfide concentrate, allowing the bacteria to convert the ferrous sulfate, as indicated by the Eh (>600 mV), then diluting to double the volume and adding concentrate to achieve the same metal level. This process was repeated until the desired metal level and volumes were achieved. The final stage of bacterial recovery was the transportation and irrigation of the secondary sulfide copper ores of the oxidized ore section. In conclusion, the project was implemented at the Aktogay mine, even though the bioleaching process was prolonged. Besides, the method of bacterial recovery might compete well with existing non-biological methods of extracting metals from ores.
Keywords: bacterial recovery, copper ore, bioleaching, bacterial inoculum
Procedia PDF Downloads 75
3059 Augmented Reality: New Relations with the Architectural Heritage Education
Authors: Carla Maria Furuno Rimkus
Abstract:
The technologies related to virtual reality and augmented reality, in combination with mobile technologies, are becoming more consolidated and more widely used each day. The increasing technological availability, along with the decrease in acquisition and maintenance costs, has favored the expansion of their use in the field of historic heritage. In this context, this article focuses on the potential of mobile applications in the dissemination of architectural heritage, using the technology of augmented reality. From this perspective, we discuss the process of producing an application for mobile devices on the Android platform, which combines the technologies of geometric modeling with augmented reality (AR) and access to interactive multimedia contents with cultural, social and historic information about the historic building that we take as the object of study: a block with a set of buildings built in the XVIII century, known as "Quarteirão dos Trapiches", which was modeled in 3D, coated with the original texture of its facades and displayed in AR. This paper discusses the methodological aspects of the development of this application, regarding both the process and the project development tools, and presents our considerations on developing an application for the Android system focused on the dissemination of architectural heritage, in order to encourage the tourist potential of the city in a sustainable way and to contribute to the digital documentation of the city's heritage, meeting a demand of tourists visiting the city and of the professionals who work in its preservation and restoration, including architects, historians, archaeologists and museum specialists, among others.
Keywords: augmented reality, architectural heritage, geometric modeling, mobile applications
Procedia PDF Downloads 478
3058 Introducing Principles of Land Surveying by Assigning a Practical Project
Authors: Introducing Principles of Land Surveying by Assigning a Practical Project
Abstract:
A practical project is used in an engineering surveying course to expose sophomore and junior civil engineering students to several important issues related to the use of basic principles of land surveying. The project, which is the design of a two-lane rural highway connecting two arbitrary points, requires students to draw the profile of the proposed highway along with the existing ground level. The areas of all cross-sections are then computed to enable quantity computations between them. Lastly, a Mass-Haul Diagram is drawn with all important parts and features shown on it for clarity. At the beginning, students faced challenges getting started on the project. They had to spend time and effort thinking of the best way to proceed and how the work would flow. It was even more challenging when they had to visualize images of cut, fill and mixed cross-sections in three dimensions before they could draw them to complete the necessary computations. These difficulties were then somewhat overcome with the help of the instructor and thorough discussions among team members and/or between different teams. The method of assessment used in this study was a well-prepared end-of-semester questionnaire distributed to students after the completion of the project and the final exam. The survey contained a wide spectrum of questions, from students' learning experience when this course development was implemented to students' satisfaction with the class instructions provided to them and the instructor's competency in presenting the material and helping with the project. It also covered the adequacy of the project as a sample of a real-life civil engineering application and whether any excitement was added by implementing this idea. At the end of the questionnaire, students had the chance to provide their constructive comments and suggestions for future improvements of the land surveying course. Outcomes will be presented graphically and in a tabular format.
Graphs provide a visual explanation of the results; tables, on the other hand, summarize numerical values along with some descriptive statistics, such as the mean, standard deviation, and coefficient of variation, for each student and each question as well. In addition to gaining experience in teamwork, communications, and customer relations, students felt the benefit of being assigned such a project. They noticed the beauty of the practical side of civil engineering work and how theories are utilized in real-life engineering applications. It was even recommended by students that such a project be exercised every time this course is offered so future students can have the same learning opportunity they had.
Keywords: land surveying, highway project, assessment, evaluation, descriptive statistics
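The quantity computations the project calls for, volumes between cross-sections and the ordinates of the Mass-Haul Diagram, can be sketched as below; the station spacing and areas are hypothetical values for illustration only:

```python
def end_area_volume(area_start, area_end, distance):
    """Average end-area method: V = (A1 + A2) / 2 * L.

    Areas in m^2, distance between stations in m, volume in m^3. (In practice
    a mixed cut/fill section would be split and each part computed separately.)
    """
    return 0.5 * (area_start + area_end) * distance

def mass_haul_ordinates(net_volumes):
    """Mass-haul ordinates: the running total of (cut - fill) volume between
    consecutive stations. The curve rises in cut, falls in fill, and returns
    to zero wherever cut and fill balance."""
    ordinates = [0.0]
    for v in net_volumes:
        ordinates.append(ordinates[-1] + v)
    return ordinates

# hypothetical net volumes (m^3) between four stations: cut positive, fill negative
print(end_area_volume(100.0, 140.0, 20.0))
print(mass_haul_ordinates([500.0, -300.0, -200.0]))
```

A final ordinate of zero, as in the hypothetical example, indicates the earthwork balances over the segment, which is the feature students read off the diagram when locating balance points.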
Procedia PDF Downloads 229
3057 Investigating the Influence of Activation Functions on Image Classification Accuracy via Deep Convolutional Neural Network
Authors: Gulfam Haider, Sana Danish
Abstract:
Convolutional Neural Networks (CNNs) have emerged as powerful tools for image classification, and the choice of optimizer profoundly affects their performance. The study of optimizers and their adaptations remains a topic of significant importance in machine learning research. While numerous studies have explored and advocated for various optimizers, the efficacy of these optimization techniques is still subject to scrutiny. This work aims to address the challenges surrounding the effectiveness of optimizers by conducting a comprehensive analysis and evaluation. The primary focus of this investigation lies in examining the performance of different optimizers when employed in conjunction with the popular Rectified Linear Unit (ReLU) activation function. By incorporating ReLU, known for its favorable properties in prior research, the aim is to bolster the effectiveness of the optimizers under scrutiny. Specifically, we evaluate these optimizers paired with both the original Softmax activation function and the modified ReLU activation function, carefully assessing their impact on overall performance. To achieve this, a series of experiments is conducted using a well-established benchmark dataset for image classification tasks, the Canadian Institute for Advanced Research dataset (CIFAR-10). The optimizers selected for investigation encompass a range of prominent algorithms: Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD). The performance analysis encompasses a comprehensive evaluation of the classification accuracy, convergence speed, and robustness of the CNN models trained with each optimizer. Through rigorous experimentation and meticulous assessment, we discern the strengths and weaknesses of the different optimization techniques, providing valuable insights into their suitability for image classification tasks.
By conducting this in-depth study, we contribute to the existing body of knowledge surrounding optimizers in CNNs, shedding light on their performance characteristics for image classification. The findings gleaned from this research serve to guide researchers and practitioners in making informed decisions when selecting optimizers and activation functions, thus advancing the state of the art in image classification with convolutional neural networks. Keywords: deep neural network, optimizers, RMSprop, ReLU, stochastic gradient descent
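The update rules of several of the optimizers compared in the abstract can be sketched in a few lines of NumPy. This is not the paper's experimental code (which trains CNNs on CIFAR-10): a toy least-squares objective stands in for the loss surface, and the hyperparameters are illustrative defaults:

```python
import numpy as np

def run_optimizer(grad_fn, w0, method, lr=0.05, steps=500):
    """Minimal versions of the SGD, RMSprop, and Adam update rules,
    applied from the same starting point for comparison."""
    w = w0.astype(float).copy()
    m = np.zeros_like(w)              # first-moment accumulator (Adam)
    v = np.zeros_like(w)              # second-moment accumulator (RMSprop/Adam)
    eps, b1, b2 = 1e-8, 0.9, 0.999
    for t in range(1, steps + 1):
        g = grad_fn(w)
        if method == "sgd":
            w -= lr * g
        elif method == "rmsprop":
            v = 0.9 * v + 0.1 * g ** 2
            w -= lr * g / (np.sqrt(v) + eps)
        elif method == "adam":
            m = b1 * m + (1 - b1) * g
            v = b2 * v + (1 - b2) * g ** 2
            m_hat = m / (1 - b1 ** t)     # bias-corrected moments
            v_hat = v / (1 - b2 ** t)
            w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# A toy least-squares problem stands in for the CNN loss surface.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
grad = lambda w: 2.0 * X.T @ (X @ w - y) / len(y)
loss = lambda w: float(np.mean((X @ w - y) ** 2))

results = {opt: run_optimizer(grad, np.zeros(3), opt)
           for opt in ("sgd", "rmsprop", "adam")}
for opt, w in results.items():
    print(f"{opt:8s} w = {np.round(w, 3)}  loss = {loss(w):.2e}")
```

On this smooth convex toy problem all three methods recover the target weights; the differences the paper studies (convergence speed, robustness) only show up on the non-convex, noisy loss surfaces of real CNN training.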
Procedia PDF Downloads 125
3056 Emerging Cyber Threats and Cognitive Vulnerabilities: Cyberterrorism
Authors: Oludare Isaac Abiodun, Esther Omolara Abiodun
Abstract:
The purpose of this paper is to demonstrate that cyberterrorism exists and poses a threat to computer security and national security. Nowadays, people have become heavily dependent upon computers, phones, the Internet, and Internet-of-Things systems to share information, communicate, conduct searches, etc. However, these network systems are at risk from sources both known and unknown. The risks arise from malicious individuals, groups, organizations, or governments who take advantage of vulnerabilities in computer systems to harvest sensitive information from people, organizations, or governments. In doing so, they engage in computer threats, crime, and terrorism, thereby making the use of computers insecure for others. The threat of cyberterrorism takes various forms and varies from one country to another. These threats include disrupting communications and information, stealing data, destroying data, leaking and breaching data, interfering with messages and networks, and, in some cases, demanding financial rewards for stolen data. Hence, this study identifies the many ways cyberterrorists utilize the Internet as a tool to advance their malicious missions, which negatively affects computer security and safety. It also identifies causes of disparate anomalous behaviors and the theoretical, ideological, and current forms of the likelihood of cyberterrorism. Therefore, as a countermeasure, this paper proposes the use of previous and current computer security models found in the literature to help counter cyberterrorism. Keywords: cyberterrorism, computer security, information, internet, terrorism, threat, digital forensic solution
Procedia PDF Downloads 96
3055 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging
Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen
Abstract:
Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, determining these parameters requires human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests, where the protein value has been determined by the FOSS Infratec NOVA, the gold industry standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. Protein regression is the problem to solve in the first dataset, while variety classification is the problem to solve in the second. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required by classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted, and the results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested. These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type. Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques
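Two of the chemometric preprocessing steps the abstract lists, standard normal variate and Savitzky-Golay filtering, can be sketched in plain NumPy. This is an illustrative reimplementation (the `snv` and `savgol_smooth` names are hypothetical), not the authors' pipeline, which would typically use library routines:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum (row)
    by its own mean and standard deviation, removing per-sample
    offset and multiplicative scatter effects."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

def savgol_smooth(y, window=5, order=2):
    """Savitzky-Golay smoothing: a local least-squares polynomial fit
    expressed as a fixed convolution filter (edges left unsmoothed)."""
    half = window // 2
    offsets = np.arange(-half, half + 1)
    A = np.vander(offsets, order + 1, increasing=True)
    coeffs = np.linalg.pinv(A)[0]   # weights giving the fitted value at offset 0
    out = y.astype(float).copy()
    out[half:-half] = np.convolve(y, coeffs[::-1], mode="valid")
    return out

# A noisy synthetic "spectrum": SNV removes per-sample offset/scale,
# SG filtering suppresses high-frequency noise.
rng = np.random.default_rng(1)
band = np.exp(-0.5 * ((np.arange(100) - 50) / 8.0) ** 2)
spectra = 3.0 + 2.0 * band + 0.05 * rng.normal(size=(4, 100))
corrected = snv(spectra)
smoothed = np.array([savgol_smooth(s) for s in corrected])
print(corrected.mean(axis=1))  # ~0 for every spectrum after SNV
```

A useful sanity check on SG smoothing is that a polynomial of degree up to `order` passes through the filter unchanged in the interior of the signal.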
Procedia PDF Downloads 99
3054 The Effect of Mood and Creativity on Product Creativity: Using LEGO as a Hands-On Activity
Authors: Kaewmart Pongakkasira
Abstract:
This study examines whether constructions built from LEGO reflect affective states and creativity, as a clue to developing effective learning resources for classrooms. For this purpose, participants are instructed to complete a hands-on activity using LEGO. Prior to the experiment, participants' affective states and creativity are measured by the Positive and Negative Affect Schedule (PANAS) and the Alternate Uses Task (AUT), respectively. Then, subjects are asked to combine LEGO pieces either freely, in as unusual a way as possible, or under constrained combination rules, and to name the LEGO products. The creativity of the LEGO products is scored for originality and abstractness of titles. It is hypothesized that individuals' mood and creativity may affect product creativity; if so, there might be a correlation among the three parameters. Keywords: affective states, creativity, hands-on activity, LEGO
Procedia PDF Downloads 373