Search results for: organismic integration theory of well-being and learning
11953 Automatic Detection Of Diabetic Retinopathy
Authors: Zaoui Ismahene, Bahri Sidi Mohamed, Abbassa Nadira
Abstract:
Diabetic Retinopathy (DR) is a leading cause of vision impairment and blindness among individuals with diabetes. Early diagnosis is crucial for effective treatment, yet current diagnostic methods rely heavily on manual analysis of retinal images, which can be time-consuming and prone to subjectivity. This research proposes an automated system for the detection of DR using Jacobi wavelet-based feature extraction combined with Support Vector Machines (SVM) for classification. The integration of wavelet analysis with machine learning techniques aims to improve the accuracy, efficiency, and reliability of DR diagnosis. In this study, retinal images are preprocessed through normalization, resizing, and noise reduction to enhance the quality of the images. The Jacobi wavelet transform is then applied to extract both global and local features, effectively capturing subtle variations in retinal images that are indicative of DR. These extracted features are fed into an SVM classifier, known for its robustness in handling high-dimensional data and its ability to achieve high classification accuracy. The SVM classifier is optimized using parameter tuning to improve performance. The proposed methodology is evaluated using a comprehensive dataset of retinal images, encompassing a range of DR severity levels. The results show that the proposed system outperforms traditional wavelet-based methods, demonstrating significantly higher accuracy, sensitivity, and specificity in detecting DR. By leveraging the discriminative power of Jacobi wavelet features and the robustness of SVM, the system provides a promising solution for the automatic detection of DR, which could assist ophthalmologists in early diagnosis and intervention, ultimately improving patient outcomes. This research highlights the potential of combining wavelet-based image processing with machine learning for advancing automated medical diagnostics.
Keywords: diabetic retinopathy (DR), Jacobi wavelets, machine learning, feature extraction, classification
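To make the pipeline above concrete, the sketch below strings together wavelet feature extraction and a tuned SVM in Python. It is only a minimal illustration of the described approach: the Jacobi wavelet transform is not available in common libraries, so a standard Daubechies wavelet from PyWavelets stands in for it, and synthetic arrays replace the preprocessed retinal images and severity labels.

```python
# Minimal sketch of the wavelet-feature + SVM pipeline described above.
# Assumption: a Daubechies wavelet stands in for the paper's Jacobi wavelet,
# and random arrays stand in for preprocessed retinal images and labels.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

def extract_wavelet_features(img, level=3, wavelet="db2"):
    """Global and local statistics of a multi-level 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(img, wavelet=wavelet, level=level)
    feats = []
    for c in coeffs:
        arrays = c if isinstance(c, tuple) else (c,)
        for a in arrays:
            feats.extend([a.mean(), a.std(), np.abs(a).max()])
    return np.array(feats)

# Synthetic stand-ins for preprocessed (normalized, resized, denoised) retinal images
rng = np.random.default_rng(0)
images = rng.random((60, 128, 128))
labels = rng.integers(0, 2, size=60)          # 0 = no DR, 1 = DR (placeholder)

X = np.vstack([extract_wavelet_features(im) for im in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)

# SVM classifier with parameter tuning, as described in the abstract
grid = GridSearchCV(SVC(kernel="rbf"), {"C": [1, 10, 100], "gamma": ["scale", 0.01]}, cv=3)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_, "test accuracy:", grid.score(X_te, y_te))
```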
Procedia PDF Downloads 141
11952 Image Segmentation: New Methods
Authors: Flaurence Benjamain, Michel Casperance
Abstract:
We present in this paper, first, a comparative study of three mathematical theories for achieving the fusion of information sources. This study aims to identify the characteristics inherent in the theory of possibilities, the theory of belief functions (DST), and the theory of plausible and paradoxical reasoning, in order to establish a strategy of choice that allows us to adopt the most appropriate theory for a given fusion problem, taking into account the acquired information and the imperfections that accompany it. Using the new theory of plausible and paradoxical reasoning, also called Dezert-Smarandache Theory (DSmT), to fuse multi-source information first requires the generation of composite events, which is, in general, difficult. Thus, we present in this paper a new approach to construct pertinent paradoxical classes based on gray-level histograms, which also reduces the cardinality of the hyper-powerset. Secondly, we developed a new technique for ordering and coding generalized focal elements. This method is exploited, in particular, to calculate the Dezert-Smarandache cardinality. Finally, we report a classification experiment on a remote sensing image that illustrates the given methods and compare the result obtained with DSmT to those obtained with DST and the theory of possibilities.
Keywords: segmentation, image, approach, vision computing
Procedia PDF Downloads 279
11951 The Diffusion of Telehealth: System-Level Conditions for Successful Adoption
Authors: Danika Tynes
Abstract:
Telehealth is a promising advancement in health care, though there are certain conditions under which telehealth has a greater chance of success. This research sought to further the understanding of what conditions compel the success of telehealth adoption at the systems level, applying Diffusion of Innovations (DoI) theory (Rogers, 1962). System-level indicators were selected to represent four components of DoI theory (relative advantage, compatibility, complexity, and observability) and regressed on five types of telehealth (teleradiology, teledermatology, telepathology, telepsychology, and remote monitoring) using multiple logistic regression. The analyses supported relative advantage and compatibility as the strongest influencers of telehealth adoption, remote monitoring in particular. These findings help to quantitatively clarify the factors influencing the adoption of innovation and advance the ability to make recommendations on the viability of state telehealth adoption. In addition, results indicate when DoI theory is most applicable to the understanding of telehealth diffusion. Ultimately, this research may contribute to more focused allocation of scarce health care resources through consideration of existing state conditions available to foster innovation.
Keywords: adoption, diffusion of innovation theory, remote monitoring, system-level indicators
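A minimal sketch of the kind of multiple logistic regression described above follows. The actual system-level indicators and telehealth adoption records used in the study are not reproduced here; the column names, synthetic data, and coefficients are hypothetical placeholders that only illustrate the modelling step.

```python
# Sketch: multiple logistic regression of an adoption outcome on DoI-component
# proxies, using synthetic data (column names and effect sizes are assumptions).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "relative_advantage": rng.normal(size=n),
    "compatibility": rng.normal(size=n),
    "complexity": rng.normal(size=n),
    "observability": rng.normal(size=n),
})
# Binary outcome: whether remote monitoring has been adopted (synthetic)
logit_p = 1.2 * df["relative_advantage"] + 0.9 * df["compatibility"] - 0.3 * df["complexity"]
df["remote_monitoring_adopted"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["relative_advantage", "compatibility", "complexity", "observability"]])
model = sm.Logit(df["remote_monitoring_adopted"], X).fit(disp=False)
print(model.summary2())        # odds ratios: np.exp(model.params)
```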
Procedia PDF Downloads 140
11950 Microgrid Design Under Optimal Control With Batch Reinforcement Learning
Authors: Valentin Père, Mathieu Milhé, Fabien Baillon, Jean-Louis Dirion
Abstract:
Microgrids offer potential solutions to meet the need for local grid stability and increase isolated networks' autonomy with the integration of intermittent renewable energy production and storage facilities. In such a context, sizing production and storage for a given network is a complex task, highly dependent on input data such as the power load profile and renewable resource availability. This work aims at developing an operating cost computation methodology for different microgrid designs based on the use of deep reinforcement learning (RL) algorithms to tackle the optimal operation problem in stochastic environments. RL is a data-based sequential decision control method based on Markov decision processes that enables the consideration of random variables for control at a chosen time scale. Agents trained via RL constitute a promising class of Energy Management Systems (EMS) for the operation of microgrids with energy storage. Microgrid sizing (or design) is generally performed by minimizing investment costs and the operational costs arising from the EMS behavior. The latter might include economic aspects (power purchase, facilities aging), social aspects (load curtailment), and ecological aspects (carbon emissions). Sizing variables are related to major constraints on the optimal operation of the network by the EMS. In this work, an islanded-mode microgrid is considered. Renewable generation is done with photovoltaic panels; an electrochemical battery ensures short-term electricity storage. The controllable unit is a hydrogen tank that is used as a long-term storage unit. The proposed approach focuses on the transfer of agent learning for the near-optimal operating cost approximation with deep RL for each microgrid size. Like most data-based algorithms, the training step in RL requires substantial computation time. The objective of this work is thus to study the potential of Batch-Constrained Q-learning (BCQ) for the optimal sizing of microgrids and, especially, to reduce the computation time of operating cost estimation in several microgrid configurations. BCQ is an off-line RL algorithm that is known to be data efficient and can learn better policies than on-line RL algorithms from the same buffer. The general idea is to use the learned policies of agents trained in similar environments to constitute a buffer. The latter is used to train BCQ, so that agent learning can be performed without updates during interaction sampling. A comparison between online RL and the presented method is performed based on the score by environment and on the computation time.
Keywords: batch-constrained reinforcement learning, control, design, optimal
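The sketch below illustrates the batch-constrained idea behind BCQ in a deliberately simplified, tabular form: Q-learning updates are performed offline over a fixed buffer and are only allowed to bootstrap from actions that the behaviour data supports. This is an assumption-laden toy version; the study itself uses deep BCQ with a buffer built from agents trained on similar microgrid environments, not the random toy transitions used here.

```python
# Schematic, tabular illustration of the batch-constrained idea behind BCQ:
# updates only bootstrap from actions well supported by the offline buffer.
# States could encode battery / hydrogen-tank levels; here they are abstract.
import numpy as np

n_states, n_actions, gamma, alpha, tau = 20, 4, 0.99, 0.1, 0.3
rng = np.random.default_rng(0)

# Offline buffer of (s, a, r, s') transitions, e.g. collected from EMS agents
# trained on similar microgrid configurations (assumption; random here).
buffer = [(rng.integers(n_states), rng.integers(n_actions),
           rng.normal(), rng.integers(n_states)) for _ in range(5000)]

# Empirical behavior policy estimated from the buffer
counts = np.zeros((n_states, n_actions))
for s, a, _, _ in buffer:
    counts[s, a] += 1
behavior = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)

Q = np.zeros((n_states, n_actions))
for _ in range(50):                      # offline passes, no environment interaction
    for s, a, r, s2 in buffer:
        # batch constraint: only data-supported actions are bootstrap candidates
        allowed = behavior[s2] >= tau * behavior[s2].max()
        target = r + gamma * Q[s2, allowed].max()
        Q[s, a] += alpha * (target - Q[s, a])

print("greedy action per state:", Q.argmax(axis=1))
```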
Procedia PDF Downloads 129
11949 Machine Learning Assisted Selective Emitter Design for Solar Thermophotovoltaic System
Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko
Abstract:
Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic
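A compact sketch of the surrogate-plus-search workflow described above follows: a random forest is fitted as a surrogate mapping the four layer thicknesses to a selectivity score, and a simple genetic algorithm then searches the surrogate for promising designs. The objective function, thickness bounds, and GA settings are illustrative assumptions; in the actual work the training data would come from electromagnetic simulations of the SiC/W/SiO2/W stack.

```python
# Sketch: random forest surrogate + genetic algorithm search for layer thicknesses.
# The "simulate_selectivity" function is a synthetic stand-in for a full-wave solver.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
bounds = np.array([[20, 300]] * 4)                 # hypothetical thickness range in nm

def simulate_selectivity(thicknesses):             # placeholder objective
    t = np.asarray(thicknesses)
    return float(np.exp(-((t - 150) ** 2).sum() / 2e4) + 0.05 * rng.normal())

X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 4))
y = np.array([simulate_selectivity(x) for x in X])
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Minimal genetic algorithm over the surrogate
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 4))
for gen in range(40):
    fitness = surrogate.predict(pop)
    parents = pop[np.argsort(fitness)[-20:]]                       # selection
    kids = parents[rng.integers(0, 20, 60)]                        # reproduction
    cross = rng.random((60, 4)) < 0.5                              # uniform crossover
    kids = np.where(cross, kids, parents[rng.integers(0, 20, 60)])
    kids += rng.normal(0, 5, kids.shape)                           # mutation
    pop = np.clip(kids, bounds[:, 0], bounds[:, 1])

best = pop[surrogate.predict(pop).argmax()]
print("candidate layer thicknesses (nm):", best.round(1))
```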
Procedia PDF Downloads 66
11948 Are Some Languages Harder to Learn and Teach Than Others?
Authors: David S. Rosenstein
Abstract:
The author believes that modern spoken languages should be equally difficult (or easy) to learn, since all normal children learning their native languages do so at approximately the same rate and with the same competence, progressing from easy to more complex grammar and syntax in the same way. Why, then, do some languages seem more difficult than others? Perhaps people are referring to the written language, where it may be true that mastering Chinese requires more time than French, which in turn requires more time than Spanish. But this may be marginal, since Chinese and French children quickly catch up to their Spanish peers in reading comprehension. Rather, the real differences in difficulty derive from two sources: hardened L1 language habits trying to cope with contrasting L2 habits; and unfamiliarity with unique L2 characteristics causing faulty expectations. It would seem that effective L2 teaching and learning must take these two sources of difficulty into consideration. The author feels that the latter (faulty expectations) causes the greatest difficulty, making effective teaching and learning somewhat different for each given foreign language. Examples from Chinese and other languages are presented.
Keywords: learning different languages, language learning difficulties, faulty language expectations
Procedia PDF Downloads 537
11947 Historical Analysis of the Evolution of Swiss Identity and the Successful Integration of Multilingualism into the Swiss Concept of Nationhood
Authors: James Beringer
Abstract:
Switzerland’s ability to forge a strong national identity across linguistic barriers has long been of interest to nationalism scholars. This raises the question of how this has been achieved, given that traditional explanations of luck or exceptionalism appear highly reductionist. This paper evaluates the theory that successful Swiss management of linguistic diversity stems from the strong integration of multilingualism into Swiss national identity. Using archival analysis of Swiss government records, historical accounts of prominent Swiss citizens, as well as secondary literature concerning the fundamental aspects of Swiss national identity, this paper charts the historical evolution of Swiss national identity. It explains how multilingualism was deliberately and successfully integrated into Swiss national identity as a response to political fragmentation along linguistic lines during the First World War. Its primary conclusions are the following. Firstly, the earliest foundations of Swiss national identity were purposefully removed from any association with a single national language. This produced symbols, myths, and values, such as a strong commitment to communalism, the imagery of the Swiss natural landscape, and the use of Latin expressions, which can be adopted across Swiss linguistic groups. Secondly, the First World War triggered a turning point in the evolution of Swiss national identity. The fundamental building blocks proved insufficient in preventing political fractures along linguistic lines, as each Swiss linguistic group gravitated towards its linguistic neighbours within Europe. To avoid a repeat of such fragmentation, a deliberate effort was made to fully integrate multilingualism as a fundamental aspect of Swiss national identity. Existing natural symbols, such as the St Gotthard Mountains, were recontextualized in order to become associated with multilingualism. The education system was similarly reformed to reflect the unique multilingual nature of the Swiss nation. The successful result of this process can be readily observed in polls and surveys, with large segments of the Swiss population highlighting multilingualism as a uniquely Swiss characteristic, indicating the symbiotic connection between multilingualism and the Swiss nation.
Keywords: language's role in identity formation, multilingualism in nationalism, national identity formation, Swiss national identity history
Procedia PDF Downloads 194
11946 Literature Review: Adversarial Machine Learning Defense in Malware Detection
Authors: Leidy M. Aldana, Jorge E. Camargo
Abstract:
Adversarial machine learning has gained importance in recent years as cybersecurity concerns have grown, especially around malware, which has affected many entities and people. This paper presents a literature review of defense methods created to prevent adversarial machine learning attacks. It first introduces the context and describes some key terms; in the results section, some of the attacks are described, with a focus on detecting adversarial examples before they reach the machine learning algorithm, and the other categories that exist in defense are presented. A five-step method is proposed in the method section to define a way to conduct the literature review; in addition, this paper summarizes the contributions in this research field over the last seven years to identify research directions in this area. Regarding the findings, the defense category with the fewest open challenges is the detection of adversarial examples, which makes it a viable research route, together with the adaptive approach in attack and defense.
Keywords: malware, adversarial, machine learning, defense, attack
Procedia PDF Downloads 76
11945 Integration of FMEA and Human Factor in the Food Chain Risk Assessment
Authors: Mohsen Shirani, Micaela Demichela
Abstract:
During the last decades, a number of food crises, such as Bovine Spongiform Encephalopathy (BSE, mad-cow disease), dioxin in chicken food, and Foot-and-Mouth Disease (FMD), have certainly damaged the reliability of the food industry. Consequently, the application of different scientific methods of risk assessment in food safety has received more attention in academia and practice. However, a practical approach that considers the entire food supply chain is lacking in the academic literature. In this regard, this paper aims to apply a risk assessment tool (FMEA) integrated with human factors along the entire supply chain of food production, test the method in a case study of dairy production, and analyze its results.
Keywords: FMEA, food supply chain, risk assessment, human factor
Procedia PDF Downloads 451
11944 The Life-Cycle Theory of Dividends: Evidence from Indonesia
Authors: Vashti Carissa
Abstract:
The main objective of this study is to examine whether the life-cycle theory of dividends can explain the determinants of an optimal dividend policy in Indonesia. The sample consists of 1,420 non-financial and non-trade, services, and investment firms listed on the Indonesia Stock Exchange during the period 2005-2014. Based on logistic regression, firm life cycle, measured by retained earnings as a proportion of total equity (RETE), has a significantly positive effect on the propensity of a firm to pay dividends. A higher portion of earned surplus in a company's capital structure reflects its maturity level, which increases the likelihood of dividend payment in mature firms. This result provides additional empirical evidence for the existence of the life-cycle theory of dividends in the dividend payout phenomenon in Indonesia. Dividends tend to be paid by mature firms, while retention dominates in growth firms. The results also show that the majority of sample firms are in the growth phase, which is consistent with the infrequent dividend distribution in Indonesia during the ten-year observation period.
Keywords: dividend, dividend policy, life-cycle theory of dividends, mix of earned and contributed capital
Procedia PDF Downloads 294
11943 Social Inclusion Challenges in Indigenous Communities: Case of the Baka Pygmies Community of Cameroon
Authors: Igor Michel Gachig, Samanta Tiague
Abstract:
The Baka ‘Pygmies’ are an indigenous community living in the rainforest region of Cameroon. This community is known to be poor and marginalized from political, economic, and social life, regardless of sedentarization and development efforts. In fact, the social exclusion of ‘Pygmy’ people prevents them from gaining basic citizens’ rights, among which are access to education, land, healthcare, employment, and justice. In this study, social interactions, behaviors, and perceptions were considered. An interview guide and focus group discussions were used to collect data. A sample size of 97 was used, with 60 Baka Pygmies and 37 Bantus from two Baka-Bantu settlements/villages of the south region of Cameroon. The data were classified into homogeneous, exhaustive, and exclusive categories. This classification enabled the factors explaining social exclusion in the Baka community to be highlighted using content analysis. The study shows that (i) limited access to education, natural resources, and care in modern healthcare organizations, and (ii) different views on development expectations and integration approaches both highlight the social exclusion of the Baka ‘Pygmies’ community. Therefore, an effective and adequate social integration of ‘Pygmies’ based on cultural peculiarities and identity, as well as a reduction of disparities and improvement of their access to education, should be of major concern to the government and policy makers.
Keywords: development, indigenous people, integration, social exclusion
Procedia PDF Downloads 142
11942 The Effects of Self-Graphing on the Reading Fluency of an Elementary Student with Learning Disabilities
Authors: Matthias Grünke
Abstract:
In this single-case study, we evaluated the effects of a self-graphing intervention to help students improve their reading fluency. Our participant was a 10-year-old girl with a suspected learning disability in reading. We applied an ABAB reversal design to test the efficacy of our approach. The dependent measure was the number of correctly read words from a children’s book within five minutes. Our participant recorded her daily performance using a simple line diagram. Results indicate that her reading rate improved simultaneously with the intervention and dropped as soon as the treatment was suspended. The findings give reasons for optimism that our simple strategy can be a very effective tool in supporting students with learning disabilities to boost their reading fluency.
Keywords: single-case study, learning disabilities, elementary education, reading problems, reading fluency
Procedia PDF Downloads 116
11941 Analysis of Simply Supported Beams Using Elastic Beam Theory
Authors: M. K. Dce
Abstract:
The aim of this paper is to investigate the behavior of simply supported beams having a rectangular section and subjected to a uniformly distributed load (UDL). In this study, beams of span 5 m, 6 m, 7 m, and 8 m have been considered. The width of all the beams is 400 mm, and the span-to-depth ratio has been taken as 12. The superimposed live load has been increased from 10 kN/m to 25 kN/m at intervals of 5 kN/m. The analysis of the beams has been carried out using elastic beam theory. On the basis of the present study, it has been concluded that the maximum bending moment as well as the maximum deflection occurs at the mid-span of a simply supported beam, and its magnitude increases in proportion to the magnitude of the UDL. Moreover, the study suggests that the maximum moment is proportional to the square of the span and the maximum deflection is proportional to the fourth power of the span.
Keywords: beam, UDL, bending moment, deflection, elastic beam theory
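For reference, the standard Euler-Bernoulli results for a simply supported beam of span L under a uniformly distributed load w (per unit length), with flexural rigidity EI, give exactly the proportionalities quoted in the abstract; these are textbook formulas rather than expressions taken from the paper.

```latex
% Simply supported beam under UDL: mid-span bending moment and deflection.
\[
  M_{\max} = \frac{w L^{2}}{8} \quad \text{(at mid-span)}, \qquad
  \delta_{\max} = \frac{5 w L^{4}}{384\, E I} \quad \text{(at mid-span)} ,
\]
% hence $M_{\max} \propto L^{2}$ and $\delta_{\max} \propto L^{4}$ for a fixed load intensity $w$.
```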
Procedia PDF Downloads 393
11940 Virtual Science Hub: An Open Source Platform to Enrich Science Teaching
Authors: Enrique Barra, Aldo Gordillo, Juan Quemada
Abstract:
This paper presents the Virtual Science Hub platform. It is an open source platform that combines a social network, an e-learning authoring tool, a video conference service, and a learning object repository for science teaching enrichment. These four main functionalities fit very well together. The platform was released in April 2012 and has not stopped growing since then. Finally, we present the results of the surveys conducted and the statistics gathered to validate this approach.
Keywords: e-learning, platform, authoring tool, science teaching, educational sciences
Procedia PDF Downloads 402
11939 Transdisciplinary Pedagogy: An Arts-Integrated Approach to Promote Authentic Science, Technology, Engineering, Arts, and Mathematics Education in Initial Teacher Education
Authors: Anne Marie Morrin
Abstract:
This paper will focus on the design, delivery and assessment of a transdisciplinary STEAM (Science, Technology, Engineering, Arts, and Mathematics) education initiative in a college of education in Ireland. The project explores a transdisciplinary approach to supporting STEAM education where the concepts, methodologies and assessments employed derive from visual art sessions within initial teacher education. The research will demonstrate that the STEAM education approach is effective when visual art concepts and methods are placed at the core of the teaching and learning experience. Within this study, emphasis is placed on authentic collaboration and transdisciplinary pedagogical approaches with the STEAM subjects. The partners included a combination of teaching expertise in STEM and Visual Arts education, artists, in-service and pre-service teachers and children. The inclusion of all stakeholders mentioned moves towards a more authentic approach where transdisciplinary practice is at the core of the teaching and learning. Qualitative data was collected using a combination of questionnaires (focused and open-ended questions) and focus groups. In addition, data was collected through video diaries where students reflected on their visual journals and transdisciplinary practice, which gave rich insight into participants' experiences and opinions on their learning. It was found that an effective program of STEAM education integration was informed by co-teaching (continuous professional development), which involved a commitment to adaptable and flexible approaches to teaching, learning, and assessment, as well as the importance of continuous reflection-in-action by all participants. The delivery of a transdisciplinary model of STEAM education was devised to reconceptualize how individual subject areas can develop essential skills and tackle critical issues (such as self-care and climate change) through data visualisation and technology. The success of the project can be attributed to the collaboration, which was inclusive and flexible, and to the willingness of the various stakeholders to be involved in the design and implementation of the project from conception to completion. The case study approach taken is particularistic (focusing on the STEAM-ED project), descriptive (providing in-depth descriptions from varied and multiple perspectives), and heuristic (interpreting the participants’ experiences and what meaning they attributed to their experiences).
Keywords: collaboration, transdisciplinary, STEAM, visual arts education
Procedia PDF Downloads 53
11938 Exploring Students’ Self-Evaluation on Their Learning Outcomes through an Integrated Cumulative Grade Point Average Reporting Mechanism
Authors: Suriyani Ariffin, Nor Aziah Alias, Khairil Iskandar Othman, Haslinda Yusoff
Abstract:
An Integrated Cumulative Grade Point Average (iCGPA) is a mechanism and strategy to ensure the curriculum of an academic programme is constructively aligned to the expected learning outcomes, with student performance based on the attainment of those learning outcomes reported objectively in a spider web. Much effort and time have been spent to develop a viable mechanism and to train academics to utilize the platform for reporting. The question is: how well do learners conceive the idea of their achievement via iCGPA, and have quality learner attributes been nurtured through the iCGPA mechanism? This paper presents the architecture of an integrated CGPA mechanism purported to address a holistic evaluation, from the evaluation of course learning outcomes to the attainment of aligned programme learning outcomes. The paper then discusses the students’ understanding of the mechanism and their evaluation of their achievement from the generated spider web. A set of questionnaires was distributed to a group of students with iCGPA reporting, and frequency analysis was used to compare the perspectives of students on their performance. In addition, the questionnaire also explored how they conceive the idea of integrated, holistic reporting and how it generates their motivation to improve. The iCGPA group was found to be receptive to what they had achieved throughout their study period. They agreed that the achievement level generated from their spider web allows them to develop interventions and enhance the programme learning outcomes before they graduate.
Keywords: learning outcomes attainment, iCGPA, programme learning outcomes, spider web, iCGPA reporting skills
Procedia PDF Downloads 211
11937 [Keynote Talk]: Caught in the Tractorbeam of Larger Influences: The Filtration of Innovation in Education Technology Design
Authors: Justin D. Olmanson, Fitsum Abebe, Valerie Jones, Eric Kyle, Xianquan Liu, Katherine Robbins, Guieswende Rouamba
Abstract:
The history of education technology--and designing, adapting, and adopting technologies for use in educational spaces--is nuanced, complex, and dynamic. Yet, despite a range of continually emerging technologies, the design and development process often yields results that appear quite similar in terms of affordances and interactions. Through this study we (1) verify the extent to which designs have been constrained, (2) consider what might account for it, and (3) offer a way forward in terms of how we might identify and strategically sidestep these influences--thereby increasing the diversity of our designs with a given technology or within a particular learning domain. We begin our inquiry from the perspective that a host of co-influencing elements, fields, and meta narratives converge on the education technology design process to exert a tangible, often homogenizing effect on the resultant designs. We identify several elements that influence design in often implicit or unquestioned ways (e.g., curriculum, learning theory, economics, learning context, pedagogy), we describe our methodology for identifying the elemental positionality embedded in a design, we direct our analysis to a particular subset of technologies in the field of literacy, and we unpack our findings. Our early analysis suggests that the majority of education technologies designed for use in US public schools are heavily influenced by a handful of mainstream theories and meta narratives. These findings have implications for how we approach the education technology design process--which we use to suggest alternative methods for designing and developing with emerging technologies. Our analytical process and reconceptualized design process hold the potential to diversify the ways emerging and established technologies get incorporated into our designs.
Keywords: curriculum, design, innovation, meta narratives
Procedia PDF Downloads 514
11936 Healthcare-SignNet: Advanced Video Classification for Medical Sign Language Recognition Using CNN and RNN Models
Authors: Chithra A. V., Somoshree Datta, Sandeep Nithyanandan
Abstract:
Sign Language Recognition (SLR) is the process of interpreting and translating sign language into spoken or written language using technological systems. It involves recognizing hand gestures, facial expressions, and body movements that make up sign language communication. The primary goal of SLR is to facilitate communication between hearing- and speech-impaired communities and those who do not understand sign language. Due to the increased awareness and greater recognition of the rights and needs of the hearing- and speech-impaired community, sign language recognition has gained significant importance over the past 10 years. Technological advancements in the fields of Artificial Intelligence and Machine Learning have made it more practical and feasible to create accurate SLR systems. This paper presents a distinct approach to SLR by framing it as a video classification problem using Deep Learning (DL), whereby a combination of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) has been used. This research targets the integration of sign language recognition into healthcare settings, aiming to improve communication between medical professionals and patients with hearing impairments. The spatial features from each video frame are extracted using a CNN, which captures essential elements such as hand shapes, movements, and facial expressions. These features are then fed into an RNN that learns the temporal dependencies and patterns inherent in sign language sequences. The INCLUDE dataset has been enhanced with more videos from the healthcare domain, and the model is evaluated on the same. Our model achieves 91% accuracy, representing state-of-the-art performance in this domain. The results highlight the effectiveness of treating SLR as a video classification task with the CNN-RNN architecture. This approach not only improves recognition accuracy but also offers a scalable solution for real-time SLR applications, significantly advancing the field of accessible communication technologies.
Keywords: sign language recognition, deep learning, convolutional neural network, recurrent neural network
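The sketch below shows the general shape of such a CNN + RNN video classifier in Keras: a small convolutional network encodes each frame, and an LSTM models the temporal sequence of frame features. Frame count, image size, channel widths, and the number of sign classes are placeholders, not the configuration reported in the paper.

```python
# Sketch of a frame-level CNN feeding an LSTM for sign-video classification.
# All dimensions below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

FRAMES, H, W, C, NUM_CLASSES = 16, 112, 112, 3, 50

# Per-frame spatial feature extractor (CNN)
frame_encoder = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(H, W, C)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.GlobalAveragePooling2D(),
])

# Apply the CNN to every frame, then model temporal dependencies with an LSTM
model = models.Sequential([
    layers.TimeDistributed(frame_encoder, input_shape=(FRAMES, H, W, C)),
    layers.LSTM(128),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_videos, train_labels, validation_data=(val_videos, val_labels))
```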
Procedia PDF Downloads 35
11935 Unsupervised Images Generation Based on Sloan Digital Sky Survey with Deep Convolutional Generative Neural Networks
Authors: Guanghua Zhang, Fubao Wang, Weijun Duan
Abstract:
Convolutional neural networks (CNNs) have attracted more and more attention in recent years, especially in the fields of computer vision and image classification. However, unsupervised learning with CNNs has received less attention than supervised learning. In this work, we use a powerful new tool, deep convolutional generative adversarial networks (DCGANs), to generate images from the Sloan Digital Sky Survey. Trained on various star and galaxy images, both the generator and the discriminator prove effective for unsupervised learning. In this paper, we also conducted several experiments to choose the best hyper-parameter values, which helps stabilize the training process and ensures good output quality.
Keywords: convolutional neural network, discriminator, generator, unsupervised learning
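A minimal DCGAN generator/discriminator pair in the spirit of the abstract is sketched below. The 64x64 single-channel image size, latent dimension, and filter counts are assumptions for illustration; they are not the settings used on the SDSS images, and the adversarial training loop is only indicated in a comment.

```python
# Sketch of a DCGAN generator and discriminator (placeholder sizes, not the paper's).
import tensorflow as tf
from tensorflow.keras import layers, models

LATENT_DIM = 100

generator = models.Sequential([
    layers.Dense(8 * 8 * 128, input_shape=(LATENT_DIM,)),
    layers.Reshape((8, 8, 128)),
    layers.Conv2DTranspose(64, 4, strides=2, padding="same"),   # 16x16
    layers.BatchNormalization(), layers.ReLU(),
    layers.Conv2DTranspose(32, 4, strides=2, padding="same"),   # 32x32
    layers.BatchNormalization(), layers.ReLU(),
    layers.Conv2DTranspose(1, 4, strides=2, padding="same",
                           activation="tanh"),                  # 64x64 image
])

discriminator = models.Sequential([
    layers.Conv2D(32, 4, strides=2, padding="same", input_shape=(64, 64, 1)),
    layers.LeakyReLU(0.2),
    layers.Conv2D(64, 4, strides=2, padding="same"),
    layers.LeakyReLU(0.2),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),                      # real vs. generated
])

# Training would alternate: update the discriminator on real image cutouts and
# generator samples, then update the generator through the frozen discriminator.
noise = tf.random.normal((4, LATENT_DIM))
print(generator(noise).shape, discriminator(generator(noise)).shape)
```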
Procedia PDF Downloads 272
11934 Quantifying Fatigue during Periods of Intensified Competition in Professional Ice Hockey Players: Magnitude of Fatigue in Selected Markers
Authors: Eoin Kirwan, Christopher Nulty, Declan Browne
Abstract:
The professional ice hockey season consists of approximately 60 regular season games, with periods of fixture congestion occurring several times in the average season. These periods of congestion provide limited time for recovery, exposing the athletes to the risk of competing whilst not fully recovered. Although a body of research is growing with respect to monitoring fatigue, particularly during periods of congested fixtures in team sports such as rugby and soccer, it has received little to no attention thus far in ice hockey athletes. Consequently, there is limited knowledge on monitoring tools that might effectively detect a fatigue response and the magnitude of fatigue that can accumulate when recovery is limited by competitive fixtures. The benefit of quantifying and establishing fatigue status is the ability to optimise training and provide pertinent information on player health, injury risk, availability and readiness. Some commonly used methods to assess the fatigue and recovery status of athletes include perceived fatigue and wellbeing questionnaires, tests of muscular force, and ratings of perceived exertion (RPE). These measures are widely used in popular team sports such as soccer and rugby and show promise as assessments of fatigue and recovery status for ice hockey athletes. As part of a larger study, this study explored the magnitude of changes in adductor muscle strength after game play and throughout a period of fixture congestion and examined the relationship between internal game load and perceived wellbeing with adductor muscle strength. Methods: 8 professional ice hockey players from a British Elite League club volunteered to participate (age = 29.3 ± 2.49 years, height = 186.15 ± 6.75 cm, body mass = 90.85 ± 8.64 kg). Prior to and after competitive games, each player performed trials of the adductor squeeze test at 0˚ hip flexion with the lead investigator using hand-held dynamometry. Rating of perceived exertion was recorded for each game, and from data on total ice time, individual session RPE was calculated. After each game, players completed a 5-point questionnaire to assess perceived wellbeing. Data were collected from six competitive games, one practice, and 36 hours post the final game, over a 10-day period. Results: Pending final data collection in February. Conclusions: Pending final data collection in February.
Keywords: congested fixtures, fatigue monitoring, ice hockey, readiness
Procedia PDF Downloads 148
11933 Combining Shallow and Deep Unsupervised Machine Learning Techniques to Detect Bad Actors in Complex Datasets
Authors: Jun Ming Moey, Zhiyaun Chen, David Nicholson
Abstract:
Bad actors are often hard to detect in data that imprints their behaviour patterns because they are comparatively rare events embedded in non-bad actor data. An unsupervised machine learning framework is applied here to detect bad actors in financial crime datasets that record millions of transactions undertaken by hundreds of actors (<0.01% bad). Specifically, the framework combines ‘shallow’ (PCA, Isolation Forest) and ‘deep’ (Autoencoder) methods to detect outlier patterns. Detection performance analysis for both the individual methods and their combination is reported.
Keywords: detection, machine learning, deep learning, unsupervised, outlier analysis, data science, fraud, financial crime
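The sketch below illustrates one plausible way to combine the named shallow and deep detectors on a synthetic, highly imbalanced table of transactions: PCA reconstruction error, Isolation Forest scores, and autoencoder reconstruction error are merged by a simple rank average. The feature layout and the score-combination rule are assumptions, not the paper's exact pipeline.

```python
# Sketch: ensemble of shallow (PCA, Isolation Forest) and deep (autoencoder)
# outlier scores on synthetic transaction features; rare "bad" rows are injected.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler
from tensorflow.keras import layers, models

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(5000, 8))          # ordinary transactions
bad = rng.normal(4, 1, size=(5, 8))                # rare bad-actor transactions
X = StandardScaler().fit_transform(np.vstack([normal, bad]))

# Shallow score 1: PCA reconstruction error
pca = PCA(n_components=3).fit(X)
pca_err = ((X - pca.inverse_transform(pca.transform(X))) ** 2).sum(axis=1)

# Shallow score 2: Isolation Forest anomaly score (higher = more anomalous)
iso = IsolationForest(contamination=0.001, random_state=0).fit(X)
iso_score = -iso.score_samples(X)

# Deep score: autoencoder reconstruction error
ae = models.Sequential([
    layers.Dense(4, activation="relu", input_shape=(8,)),
    layers.Dense(2, activation="relu"),
    layers.Dense(4, activation="relu"),
    layers.Dense(8),
])
ae.compile(optimizer="adam", loss="mse")
ae.fit(X, X, epochs=10, batch_size=128, verbose=0)
ae_err = ((X - ae.predict(X, verbose=0)) ** 2).sum(axis=1)

# Simple rank-average ensemble of the three scores
ranks = sum(s.argsort().argsort() for s in (pca_err, iso_score, ae_err))
print("top suspects (row indices):", np.argsort(ranks)[-5:])
```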
Procedia PDF Downloads 100
11932 Design Analysis for Declining Admission Trend in Canada Public Diploma Programs
Authors: Zulfiqar Ali
Abstract:
Current survey reports and data demonstrate a declining trend of admissions in instructor-led synchronous diploma programs in Canadian public higher education institutes. A significant impact can also be seen on various Information Technology (IT) related diploma programs in prominent Canadian higher education institutes across the country. The significant external factors that affect students' interest in admission to instructor-led synchronous Information Technology related diploma programs include, but are not limited to, easy access to online learning materials provided by external competitors. IT giants such as Microsoft, Cisco, Google, AWS, and Linux became heavily involved in training and certification programs through the Learning Management Systems (LMS) of their own academies. They provide advanced learning and teaching resources embedded with cloud and artificial intelligence (AI) tools, techniques, and design. The other external factor is the fit between the rate of change of technology (velocity) in business and the rate of change of adoption and transformation of cloud-based artificial intelligence (AI) in diploma programs at Canadian public higher education institutes. The significant internal factors may include, but are not limited to, the legacy type of curriculum design, tools, techniques, style, and delivery. Another major contributor to the declining admission trend in IT-related programs at Canadian public higher education institutes is the diversity of learning and teaching styles that comes from existing hiring and immigration processes. The proposed research addresses the contribution of both the internal and external factors to the declining admission trend in instructor-led synchronous diploma programs at Canadian public higher education institutes. The research approaches to be adopted for the proposed work include collecting data, filtering data, quantitative analysis, qualitative analysis, and a mixed approach. The focal point of this research is the contribution of the major internal factors to the declining admission trend, including curriculum design, delivery methods, academic integrity, velocity, cloud-based AI tools and techniques, and integration with the existing learning management system. Finally, the research results lead to analysis-based recommendations and a design to cope with the challenge of the declining admission trend in diploma programs at Canadian public higher education institutes vis-à-vis internal and external factors.
Keywords: advanced curriculum design, analysis of internal educational factors, analysis of external educational factors, educational technology
Procedia PDF Downloads 8
11931 Effectiveness of Active Learning in Social Science Courses at Japanese Universities
Authors: Kumiko Inagaki
Abstract:
In recent years, Japanese universities have begun to face a dilemma: more than half of all high school graduates go on to attend an institution of higher learning, overwhelming Japanese universities accustomed to small student bodies. These universities have been forced to embrace qualitative changes to accommodate the increased number and diversity of students who enter their establishments, students who differ in their motivations for learning, their levels of eagerness to learn, and their perspectives on the future. One of these changes is an increase in awareness among Japanese educators of the importance of active learning, which deepens students’ understanding of course material through a range of activities, including writing, speaking, thinking, and presenting, in addition to conventional “passive learning” methods such as listening to a one-way lecture. The purpose of this study is to examine the effectiveness of the teaching method adapted to improve active learning. A teaching method designed to promote active learning was implemented in a social science course at one of the most popular universities in Japan. A questionnaire using a five-point response format was given to students in 2,305 courses throughout the university to evaluate the effectiveness of the method based on the following measures: ① the ratio of students who were motivated to attend the classes, ② the rate at which students learned new information, and ③ the teaching method adopted in the classes. The results of this study show that the percentage of students who attended the active learning course eagerly, and the rate of new knowledge acquired through the course, both exceeded the average for the university, the department, and the subject area of social science. In addition, there are strong correlations between teaching method and student motivation and between teaching method and knowledge acquisition rate. These results indicate that the active learning teaching method was effectively implemented and that it may improve student eagerness to attend class and motivation to learn.
Keywords: active learning, Japanese university, teaching method, university education
Procedia PDF Downloads 201
11930 Against the Idea of Public Power as Free Will
Authors: Donato Vese
Abstract:
According to the common interpretation, in a legal system, public powers are established by law. Exceptions are admitted in an emergency or in a particular relationship with public power. However, we currently agree that the law allows public administration a margin of decision, even in the case of non-discretionary acts. Hence, the administrative decision not exclusively established by law becomes the rule in the ordinary state of things, not only in a state of exception. This paper aims to analyze and discuss different ideas on discretionary power under the Rule of Law and the Rechtsstaat. Observing the legal literature in Europe and North and South America, discretionary power can be described as follows: it could be considered a margin that the law accords to the executive power for political decisions, or a choice between different interpretations of vague legal provisions. In essence, this explanation admits, for the executive, a decision not established by law, or at any rate not exclusively established by law. This means that the discretionary power of public administration integrates the law. However, integrating the law does not mean deciding according to the law; it means integrating the law with a decision involving public power. Consequently, discretionary power is essentially free will. From this perspective, the Rule of Law and the Rechtsstaat are also notions that are explained differently. Recently, we can observe how the European notion of the Rechtsstaat is founded on the formal validity of the law; therefore, for this notion, public authority decisions not regulated by law represent a problem. Thus, different systems of law integration have been proposed in the legal literature, such as values, democracy, reasonableness, and so on. This paper aims to verify how, looking at those integration clauses from a logical viewpoint, integration based on recourse to the legal system itself does not resolve the problem. The aforementioned integration clauses are legal rules that require hard work to explain the correct meaning of the law; in particular, they introduce dangerous criteria in favor of the political majority. A different notion of public power can be proposed. This notion includes two main features: (a) sovereignty belongs to persons and not the state, and (b) fundamental rights are not grounded but recognized by Constitutions. Hence, public power is a system based on fundamental rights. According to this approach, the public interest can be defined as the concrete maximization of the enjoyment of fundamental rights. Accordingly, integration of law that is vague or subject to several interpretations must be done by referring to the system of fundamental individual rights. We can think, for instance, of fundamental rights that are rights in an objective view but not legal rights, because they are not established by law.
Keywords: administrative discretion, free will, fundamental rights, public power, sovereignty
Procedia PDF Downloads 113
11929 Mentor and Mentee Based Learning
Authors: Erhan Eroğlu
Abstract:
This paper presents a new method called Mentor and Mentee Based Learning. This method is becoming more and more common, especially at workplaces. This study is significant as it clearly underlines how the method works well. Education has always aimed at equipping people with the necessary knowledge and information. For many decades, it relied on teachers' talk-and-chalk methods. In the second half of the nineteenth century, educators felt the need for some changes in delivery systems. Some new terms, like self-discovery, learner engagement, student-centered learning, and hands-on learning, have been popular for a long time. However, some educators believe that there is much room for better learning methods in many fields, as they think learners still cannot fulfill their potential capacities. Thus, new systems and methods are still being developed and applied at education centers and workplaces. One of the latest methods is assigning mentors to newly recruited employees and training them within a mentor and mentee program, which allows both parties to see their strengths and weaknesses and the areas which can be improved. This paper aims at finding out the perceptions of mentors and mentees on the programs they are offered at their workplaces and suggests some alternatives for improvement. The study has been conducted via a qualitative method whereby interviews were held with mentors and mentees, both separately and together. Results show that it is a great way to train inexperienced employees and also to refresh more experienced ones. Some points to be improved have also been underlined. The paper shows that education is not a one-way path to follow.
Keywords: learning, mentor, mentee, training
Procedia PDF Downloads 229
11928 A Case Study of Deep Learning for Disease Detection in Crops
Authors: Felipe A. Guth, Shane Ward, Kevin McDonnell
Abstract:
In the precision agriculture area, one of the main tasks is the automated detection of diseases in crops. Machine learning algorithms have been studied in recent decades for such tasks in view of the potential economic benefits that automated disease detection may bring to crop fields. The latest generation of deep learning convolutional neural networks has presented significant results in the area of image classification. Accordingly, this work tested the implementation of a deep learning convolutional neural network architecture for the detection of diseases in different types of crops. A data augmentation strategy was used to meet the requirements of the algorithm implemented with a deep learning framework. Two test scenarios were deployed. The first scenario trained a neural network on images extracted from a controlled environment, while the second one used images from both the field and the controlled environment. The results evaluated the generalisation capacity of the neural networks in relation to the two types of images presented. Results yielded a general classification accuracy of 59% in scenario 1 and 96% in scenario 2.
Keywords: convolutional neural networks, deep learning, disease detection, precision agriculture
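A minimal Keras sketch of the augmentation-plus-CNN setup described above follows. The directory layout, image size, augmentation settings, and number of disease classes are placeholders rather than the study's actual configuration.

```python
# Sketch: data augmentation feeding a small CNN classifier for crop disease images.
# "crop_images/" is a hypothetical directory with one sub-folder per disease class.
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

NUM_CLASSES = 5

# Augmentation to expand the limited set of field / controlled-environment images
augmenter = ImageDataGenerator(rescale=1.0 / 255, rotation_range=30,
                               horizontal_flip=True, zoom_range=0.2,
                               validation_split=0.2)
train_gen = augmenter.flow_from_directory("crop_images/", target_size=(128, 128),
                                          batch_size=32, subset="training")
val_gen = augmenter.flow_from_directory("crop_images/", target_size=(128, 128),
                                        batch_size=32, subset="validation")

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(128, 128, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_gen, validation_data=val_gen, epochs=20)
```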
Procedia PDF Downloads 264
11927 Hacking the Spatial Limitations in Bridging Virtual and Traditional Teaching Methodologies in Sri Lanka
Authors: Manuela Nayantara Jeyaraj
Abstract:
Having moved into the 21st century, it is well past arguable that innovative technology needs to be incorporated into conventional classroom teaching. Though the Western world has found presumable success in achieving this, it is still a contested concept in developing countries such as Sri Lanka. Implementing interactive virtual learning within classrooms remains a struggling, idealistic aspiration within the island. To overcome this problem, this study sets out to reveal the facts that limit the implementation of virtual, interactive learning within school classrooms and to provide hacks that could promote the augmented use of the virtual world to enhance teaching and learning experiences. As each classroom adopts technology to fulfill its functions, the hacks provided will shift administrative burdens onto a virtual system. These hacks may reveal barriers based on social conventions, financial boundaries, digital literacy, and the intellectual capacity of the staff; highlight the impediments to introducing students to an interactive virtual learning environment; and thereby provide the necessary actions or changes to be made to succeed in creating an intellectual society built on virtual learning and lifestyle. This digital learning environment will be composed of multimedia presentations, trivia and pop quizzes conducted on a GUI, assessments conducted via a virtual system, records maintained on a database, etc. The ultimate objective of this study is to enhance every child's basic learning environment, hence diminishing the digital divide that exists in certain communities.
Keywords: digital divide, digital learning, digitization, Sri Lanka, teaching methodologies
Procedia PDF Downloads 358
11926 Silencing the Protagonist: Gender and Rape Depiction in Pakistani Dramas
Authors: Saman R. Khan, Najma Sadiq
Abstract:
The silencing of opinions is an important aspect of the Spiral of Silence theory; however, its applicability to rape-themed dramas requires investigation. This study focuses on the portrayal of female rape victim protagonists in Pakistani dramas and the factors influencing their behavior after rape. A quantitative content analysis was conducted on two prime-time dramas that directly dealt with female rape victims. Results indicate that the female protagonists who faced rape are shown as silent and submissive characters who are unable to communicate about their ordeal due to fear of social isolation. These findings lend support to the Spiral of Silence theory and indicate that the theory’s basic elements (inability to express opinions and fear of social isolation) exist in these TV dramas.
Keywords: gender stereotyping, rape victims, the spiral of silence, TV dramas
Procedia PDF Downloads 174
11925 Multi-Classification Deep Learning Model for Diagnosing Different Chest Diseases
Authors: Bandhan Dey, Muhsina Bintoon Yiasha, Gulam Sulaman Choudhury
Abstract:
Chest disease is one of the most problematic ailments in our daily life. There are many known chest diseases, and diagnosing them correctly plays a vital role in the process of treatment. Many methods have been developed explicitly for different chest diseases, but the most common approach for diagnosing these diseases is through X-ray imaging. In this paper, we propose a multi-classification deep learning model for diagnosing COVID-19, lung cancer, pneumonia, tuberculosis, and atelectasis from chest X-rays. In the present work, we used the transfer learning method for better accuracy and a faster training phase. The performance of three architectures is considered: InceptionV3, VGG-16, and VGG-19. We evaluated these deep learning architectures using public digital chest X-ray datasets with six classes (i.e., COVID-19, lung cancer, pneumonia, tuberculosis, atelectasis, and normal). The experiments were conducted on six-class classification, and we found that VGG16 outperforms the other proposed models with an accuracy of 95%.
Keywords: deep learning, image classification, X-ray images, TensorFlow, Keras, chest diseases, convolutional neural networks, multi-classification
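The transfer-learning recipe described above can be sketched as follows: a VGG16 base pretrained on ImageNet is frozen and a small classification head is trained for the six chest X-ray classes. The head size, input shape, and frozen-base choice follow common practice and are assumptions, not the paper's exact settings.

```python
# Sketch: VGG16 transfer learning for six-class chest X-ray classification.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 6   # COVID-19, lung cancer, pneumonia, tuberculosis, atelectasis, normal

base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                      # transfer learning: reuse ImageNet features

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)   # hypothetical datasets
```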
Procedia PDF Downloads 98
11924 Analysis and Simulation of TM Fields in Waveguides with Arbitrary Cross-Section Shapes by Means of Evolutionary Equations of Time-Domain Electromagnetic Theory
Authors: Ömer Aktaş, Olga A. Suvorova, Oleg Tretyakov
Abstract:
The boundary value problem on a non-canonical, arbitrarily shaped contour is solved with a numerically effective method called the Analytical Regularization Method (ARM) to calculate propagation parameters. As a result of regularization, the equation of the first kind is reduced to an infinite system of linear algebraic equations of the second kind in the space L2. This system can be solved numerically to the desired accuracy by using the truncation method. Parameters such as the cut-off wavenumber and cut-off frequency are used in the waveguide evolutionary equations of time-domain electromagnetic theory to illustrate the real-valued TM fields in lossy and lossless media.
Keywords: analytical regularization method, evolutionary equations of time-domain electromagnetic theory, TM field
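Schematically, and with notation assumed rather than taken from the paper, the regularization and truncation steps described above can be written as follows: an ill-conditioned first-kind equation is converted, by extracting an analytically invertible part, into a second-kind system whose truncated version can be solved to the desired accuracy.

```latex
% Schematic form of analytical regularization followed by truncation (assumed notation).
\[
  A x = b, \qquad A = A_0 + A_1
  \;\;\Longrightarrow\;\;
  \left( I + A_0^{-1} A_1 \right) x = A_0^{-1} b ,
\]
\[
  x_n + \sum_{m=1}^{N} H_{nm}\, x_m = b_n , \qquad n = 1, \dots, N ,
\]
% where $H = A_0^{-1} A_1$ is compact, so the truncated $N \times N$ system
% converges to the exact solution as $N \to \infty$.
```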
Procedia PDF Downloads 502