Search results for: traditional learning approach
18984 A Deep Learning Based Integrated Model For Spatial Flood Prediction
Authors: Vinayaka Gude, Divya Sampath
Abstract:
The research introduces an integrated prediction model to assess the susceptibility of roads in a future flooding event. The model consists of a deep learning algorithm for forecasting gauge height data and a Flood Inundation Mapper (FIM) for spatial flooding. An optimal architecture for a long short-term memory (LSTM) network was identified for the gauge located on the Tangipahoa River at Robert, LA. Dropout was applied to the model to evaluate the uncertainty associated with the predictions. The estimates are then used along with the FIM to identify spatial flooding. Further geoprocessing in ArcGIS provides the susceptibility values for different roads. The model was validated against the devastating flood of August 2016. The paper discusses the challenges of generalizing the methodology to other locations and to various types of flooding. The developed model can be used by transportation departments and other emergency response organizations for effective disaster management.
Keywords: deep learning, disaster management, flood prediction, urban flooding
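The abstract leaves the LSTM architecture unspecified, but the dropout-based uncertainty estimate it mentions is commonly realized as Monte-Carlo dropout: keep dropout active at prediction time and read the spread of repeated forecasts as uncertainty. A minimal NumPy sketch of that idea follows, with a toy dense model and invented weights standing in for the trained LSTM:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" one-layer forecaster standing in for the paper's LSTM,
# purely to illustrate how dropout kept active at prediction time
# (Monte-Carlo dropout) yields an uncertainty band around a forecast.
W1 = rng.normal(size=(32, 8))
W2 = rng.normal(size=(1, 32)) / 32.0

def predict_with_dropout(x, p_drop=0.2):
    h = np.maximum(W1 @ x, 0.0)              # hidden activations (ReLU)
    mask = rng.random(h.shape) > p_drop      # fresh random dropout mask
    h = h * mask / (1.0 - p_drop)            # inverted-dropout scaling
    return (W2 @ h).item()

x = rng.normal(size=8)                       # last 8 gauge readings (toy data)
samples = np.array([predict_with_dropout(x) for _ in range(200)])
mean, std = samples.mean(), samples.std()
print(f"forecast {mean:.2f} +/- {std:.2f}")
```

In practice the same trick would be applied to the trained LSTM itself, sampling many forecasts per input window and reporting their mean and standard deviation as the gauge-height estimate and its uncertainty.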
Procedia PDF Downloads 152
18983 Preliminary Study of Hand Gesture Classification in Upper-Limb Prosthetics Using Machine Learning with EMG Signals
Authors: Linghui Meng, James Atlas, Deborah Munro
Abstract:
There is an increasing demand for prosthetics capable of mimicking natural limb movements and hand gestures, but precise movement control of prosthetics using only electrode signals continues to be challenging. This study considers the implementation of machine learning as a means of improving accuracy and presents an initial investigation into hand gesture recognition using models based on electromyographic (EMG) signals. EMG signals, which capture muscle activity, are used as inputs to machine learning algorithms to improve prosthetic control accuracy, functionality and adaptivity. Using logistic regression, a machine learning classifier, this study evaluates the accuracy of classifying two hand gestures from the publicly available Ninapro dataset using two time-series feature extraction algorithms: Time Series Feature Extraction (TSFE) and Convolutional Neural Networks (CNNs). Trials were conducted using varying numbers of EMG channels, from one to eight, to determine the impact of channel quantity on classification accuracy. The results suggest that although both algorithms can successfully distinguish between hand gesture EMG signals, CNNs outperform TSFE in extracting useful information, in terms of both accuracy and computational efficiency. In addition, although more channels of EMG signals provide more useful information, they also require more complex and computationally intensive feature extractors, and consequently did not perform as well as lower channel counts. The findings also underscore the potential of machine learning techniques in developing more effective and adaptive prosthetic control systems.
Keywords: EMG, machine learning, prosthetic control, electromyographic prosthetics, hand gesture classification, CNN, convolutional neural networks, TSFE, time series feature extraction, channel count, logistic regression, Ninapro, classifiers
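The pipeline described above (per-channel time-series features feeding a logistic-regression classifier) can be sketched in a few lines. The sketch below uses two classic time-domain EMG features (mean absolute value and root mean square) on synthetic 4-channel signals; the paper's actual TSFE/CNN extractors and the Ninapro recordings are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

def emg_features(window):
    # Two classic time-domain EMG features per channel:
    mav = np.mean(np.abs(window), axis=1)       # mean absolute value
    rms = np.sqrt(np.mean(window**2, axis=1))   # root mean square
    return np.concatenate([mav, rms])

# Synthetic 4-channel EMG: gesture 0 is low-amplitude, gesture 1 high-amplitude.
def make_window(label):
    amp = 0.5 if label == 0 else 1.5
    return amp * rng.normal(size=(4, 200))      # 4 channels x 200 samples

X = np.array([emg_features(make_window(y)) for y in (0, 1) * 50])
y = np.array([0, 1] * 50)

# Plain logistic regression fitted by gradient descent on the features.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))      # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

acc = np.mean(((X @ w + b) > 0) == y)
print(f"training accuracy: {acc:.2f}")
```

Varying the number of channels here amounts to slicing rows of each window before feature extraction, which mirrors the abstract's one-to-eight channel trials.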
Procedia PDF Downloads 43
18982 A Quality Improvement Approach for Reducing Stigma and Discrimination against Young Key Populations in the Delivery of Sexual Reproductive Health and Rights Services
Authors: Atucungwiire Rwebiita
Abstract:
Introduction: In Uganda, the provision of adolescent sexual reproductive health and rights (SRHR) services for key populations is still hindered by negative attitudes, stigma and discrimination (S&D) at both the community and facility levels. To address this barrier, Integrated Community Based Initiatives (ICOBI), with support from SIDA, is currently implementing a quality improvement (QI) innovative approach for strengthening the capacity of key population (KP) peer leaders and health workers to deliver friendly SRHR services without S&D. Methods: Our innovative approach involves continuous mentorship and coaching of 8 QI teams at 8 health facilities and their catchment areas. Each of the 8 teams (comprising 5 health workers and 5 KP peer leaders) is facilitated twice a month by two QI mentors in a 2-hour mentorship session over a period of 4 months. The QI mentors received a 2-week training on QI approaches for reducing S&D against young key populations in the delivery of SRHR services. The mentorship sessions are guided by a manual on which teams base their analysis of the root causes of S&D and their development of key performance indicators (KPIs), in the 1st and 2nd sessions respectively. The teams then develop action plans in the 3rd session and review implementation progress on the KPIs at the end of subsequent sessions. The KPIs capture information on the attitudes of health workers and peer leaders, the general service delivery setting, and clients' experience. A dashboard was developed to routinely track the KPIs for S&D across all the supported health facilities and catchment areas. After 4 months, QI teams share documented QI best practices and tested change packages on S&D in a learning and exchange session involving all the teams. Findings: The implementation of this approach is showing positive results.
So far, QI teams have identified the root causes of S&D against key populations, including poor information among health workers, fear of a perceived risk of infection, and perceived links between HIV and disreputable behaviour. Others are perceptions that HIV and STIs are divine punishment and that sex work and homosexuality are against religious and cultural values. They have also noted the perception that MSM are mentally sick and a danger to everyone. Eight QI teams have developed action plans to address the root causes of S&D. Conclusion: This approach is promising and offers a novel, scalable means to implement stigma-reduction interventions in facility and community settings.
Keywords: key populations, sexual reproductive health and rights, stigma and discrimination, quality improvement approach
Procedia PDF Downloads 179
18981 Intensive Use of Software in Teaching and Learning Calculus
Authors: Nodelman V.
Abstract:
Despite serious difficulties in the assimilation of the conceptual system of Calculus, software is used in the educational process only occasionally, and even then mainly for illustration purposes. There are several reasons: the non-trivial nature of the studied material; a lack of skills in working with software; fear of losing time while working with software; the variety of the software itself, with its corresponding interfaces, syntax, and working methods; the need to find suitable models and become familiar with them; and the incomplete compatibility of the available models with the content and teaching methods of the studied material. This paper proposes an active use of the developed non-commercial software VusuMatica, which removes these restrictions through broad support for the studied mathematical material (and not only Calculus), so there is no need to select the right software; an emphasis on the unity of mathematics and its intrasubject and interdisciplinary relations; a user-friendly interface; the absence of special syntax for defining mathematical objects; ease of building and manipulating models of the studied material; and unlimited flexibility of models thanks to the ability to redefine objects, which allows exploring objects' characteristics and considering examples and counterexamples of the concepts under study. The construction of models is based on an original approach to the analysis of the structure of the studied concepts. Thanks to the ease of construction, students are able not only to use ready-made models but also to create them on their own and explore the studied material with their help. The presentation includes examples of using VusuMatica in studying the concepts of the limit and continuity of a function, its derivative, and its integral.
Keywords: counterexamples, limitations and requirements, software, teaching and learning calculus, user-friendly interface and syntax
Procedia PDF Downloads 84
18980 A New Approach for Assertions Processing during Assertion-Based Software Testing
Authors: Ali M. Alakeel
Abstract:
Assertion-based software testing has been shown to be a promising tool for generating test cases that reveal program faults. Because the number of assertions may be very large for industry-size programs, one of the main concerns about the applicability of assertion-based testing is the amount of search time required to explore a large number of assertions. This paper presents a new approach for exploring assertions during the process of assertion-based software testing. Our initial experiments with the proposed approach show that the performance of assertion-based testing may be improved, making this approach more efficient when applied to programs with a large number of assertions.
Keywords: software testing, assertion-based testing, program assertions, generating test
Procedia PDF Downloads 466
18979 Single-Camera Basketball Tracker through Pose and Semantic Feature Fusion
Authors: Adrià Arbués-Sangüesa, Coloma Ballester, Gloria Haro
Abstract:
Tracking sports players is a widely challenging scenario, especially in single-feed videos recorded on tight courts, where cluttering and occlusions cannot be avoided. This paper presents an analysis of several geometric and semantic visual features to detect and track basketball players. An ablation study is carried out and then used to show that a robust tracker can be built with deep learning features alone, without the need to extract contextual ones, such as proximity or color similarity, or to apply camera stabilization techniques. The presented tracker consists of: (1) a detection step, which uses a pretrained deep learning model to estimate the players' poses, followed by (2) a tracking step, which leverages pose and semantic information from the output of a convolutional layer in a VGG network. Its performance is analyzed in terms of MOTA over a basketball dataset with more than 10k instances.
Keywords: basketball, deep learning, feature extraction, single-camera, tracking
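The tracking step described above amounts to frame-to-frame association under a cost that fuses pose distance with appearance (deep-feature) similarity. A greedy toy version of that association is sketched below, with 2-D centroids standing in for full pose keypoints and hand-made vectors standing in for VGG features; the paper's exact cost weights and matching scheme are not specified in the abstract:

```python
import numpy as np

def match_players(prev, curr, w_pose=0.5, w_feat=0.5):
    """Greedy frame-to-frame association combining pose distance with
    appearance (deep-feature) cosine dissimilarity."""
    cost = np.zeros((len(prev), len(curr)))
    for i, (pi, fi) in enumerate(prev):
        for j, (pj, fj) in enumerate(curr):
            pose_d = np.linalg.norm(pi - pj)               # spatial distance
            feat_d = 1.0 - fi @ fj / (np.linalg.norm(fi) * np.linalg.norm(fj))
            cost[i, j] = w_pose * pose_d + w_feat * feat_d
    assignment = {}
    while cost.size and np.isfinite(cost).any():
        i, j = np.unravel_index(np.argmin(cost), cost.shape)
        assignment[i] = j
        cost[i, :] = np.inf                                # consume this track
        cost[:, j] = np.inf                                # and this detection
    return assignment

# Two players per frame: (pose centroid, appearance vector) pairs.
prev = [(np.array([0.0, 0.0]), np.array([1.0, 0.0])),
        (np.array([5.0, 5.0]), np.array([0.0, 1.0]))]
curr = [(np.array([5.2, 5.1]), np.array([0.1, 1.0])),
        (np.array([0.1, 0.2]), np.array([1.0, 0.1]))]
print(match_players(prev, curr))  # {0: 1, 1: 0} — identities preserved
```

A production tracker would replace the greedy loop with an optimal assignment (e.g. the Hungarian algorithm) and the toy vectors with per-joint keypoints and real convolutional-layer activations.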
Procedia PDF Downloads 142
18978 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer
Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack
Abstract:
We propose an automated analysis of flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC), optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contribution to the cost function for the investigated control laws, and (3) visualization of the dynamics. Key enablers are the classification and feature extraction methods of machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices with subsequent vortex pairing by higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human-interpretable, fully automated analysis of MLC that identifies qualitatively different actuation regimes, distills the corresponding coherent structures, and develops a digital twin of the plant.
Keywords: machine learning control, mixing layer, feedback control, model-free control
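The "catalogue of representative flow states" can be produced by clustering the time-resolved snapshots; the abstract does not name the algorithm, so the sketch below uses plain k-means on synthetic snapshot features purely as an illustration of the idea:

```python
import numpy as np

rng = np.random.default_rng(3)

def kmeans(snapshots, k, n_iter=20):
    """Minimal k-means; each centroid becomes one representative flow state.
    Deterministic spread-out initialization keeps the sketch reproducible."""
    init = np.linspace(0, len(snapshots) - 1, k).astype(int)
    centroids = snapshots[init].copy()
    for _ in range(n_iter):
        d = np.linalg.norm(snapshots[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)              # nearest centroid per snapshot
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = snapshots[labels == c].mean(axis=0)
    return centroids, labels

# Toy "snapshots" in a 10-D feature space: two dynamical regimes standing in
# for, e.g., unforced Kelvin-Helmholtz vortex shedding vs. the actuated
# higher-frequency state described in the abstract.
regime_a = rng.normal(0.0, 0.3, size=(40, 10))
regime_b = rng.normal(2.0, 0.3, size=(40, 10))
snapshots = np.vstack([regime_a, regime_b])

centroids, labels = kmeans(snapshots, k=2)
print("snapshots per cluster:", np.bincount(labels))
```

With real data the feature vectors would come from a dimensionality reduction of the velocity fields (e.g. POD coefficients), and each centroid would be visualized as a coherent structure contributing to the cost function.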
Procedia PDF Downloads 228
18977 Investigating Online Literacy among Undergraduates in Malaysia
Authors: Vivien Chee Pei Wei
Abstract:
Today we live in a scenario in which letters share space with images on screens that vary in size, shape, and style. The popularization of television, then the computer, and now e-readers, tablets, and smartphones has made electronic media assume the role that was previously restricted to printed materials. Since the extensive use of new technologies to produce, disseminate, collect, and access electronic publications began, the changes to reading have intensified. Reading online involves more than just utilizing specific skills, strategies, and practices; it also means negotiating multiple information sources. In this study, different perspectives on digital reading are explored in order to define the key aspects of the term. The focus is on how new technologies affect undergraduates' reading behavior, which in turn gives readers different reading levels and engagement with the text and other supporting materials in the same media. Also important is the relationship between reading platforms, reading levels, and formats of electronic publications. The study looks at the online reading practices of about 100 undergraduates from a local university. The data, collected through a survey and interviews with the respondents, are analyzed thematically. Findings from this study show that digital and traditional reading are interrelated and should not be viewed as separate but as complementary to each other. However, reading online complicates some of the skills required by traditional reading. Consequently, in order to successfully read and comprehend multiple sources of information online, undergraduates need regular opportunities to practice and develop their skills as part of their natural reading practices.
Keywords: concepts, digital reading, literacy, traditional reading
Procedia PDF Downloads 314
18976 Correlation Analysis between Sensory Processing Sensitivity (SPS), Meares-Irlen Syndrome (MIS) and Dyslexia
Authors: Kaaryn M. Cater
Abstract:
Students with sensory processing sensitivity (SPS), Meares-Irlen Syndrome (MIS) and dyslexia can become overwhelmed and struggle to thrive in traditional tertiary learning environments. An estimated 50% of tertiary students who disclose learning-related issues are dyslexic. This study explores the relationship between SPS, MIS and dyslexia. Baseline measures will be analysed to establish any correlation between these three minority methods of information processing. SPS is an innate sensitivity trait found in 15-20% of the population and has been identified in over 100 species of animals. Humans with SPS are referred to as Highly Sensitive People (HSP), and the measure of HSP is a 27-item self-test known as the Highly Sensitive Person Scale (HSPS). A 2016 study conducted by the author established baseline data for HSP students in a tertiary institution in New Zealand. The results of the study showed that all participating HSP students believed the knowledge of SPS to be life-changing and useful in managing life and study; in addition, they believed that all tutors and incoming students should be given information on SPS. MIS is a visual processing and perception disorder that is found in approximately 10% of the population and has a variety of symptoms including visual fatigue, headaches and nausea. One way to ease some of these symptoms is through the use of colored lenses or overlays. Dyslexia is a complex phonologically based information processing variation present in approximately 10% of the population. An estimated 50% of dyslexics are thought to have MIS. The study exploring possible correlations between these minority forms of information processing is due to begin in February 2017. An invitation will be extended to all first year students enrolled in degree programmes across all faculties and schools within the institution. An estimated 900 students will be eligible to participate in the study.
Participants will be asked to complete a battery of online questionnaires including the Highly Sensitive Person Scale, the International Dyslexia Association adult self-assessment and the adapted Irlen indicator. All three scales have been used extensively in the literature and have been validated among many populations. All participants whose scores on any (or some) of the three questionnaires suggest a minority method of information processing will receive an invitation to meet with a learning advisor and will be given access to counselling services if they choose. Meeting with a learning advisor is not mandatory, and some participants may choose not to receive help. Data will be collected using the Question Pro platform, and baseline data will be analysed using correlation and regression analysis to identify relationships and predictors between SPS, MIS and dyslexia. This study forms part of a larger three-year longitudinal study, and participants will be required to complete questionnaires at annual intervals in subsequent years until completion of (or withdrawal from) their degree. At these data collection points, participants will be questioned on any additional support received relating to their minority method(s) of information processing. Data from this study will be available by April 2017.
Keywords: dyslexia, highly sensitive person (HSP), Meares-Irlen Syndrome (MIS), minority forms of information processing, sensory processing sensitivity (SPS)
Procedia PDF Downloads 250
18975 Effective Glosses in Reading to Help L2 Vocabulary Learning for Low-Intermediate Technology University Students in Taiwan
Authors: Pi-Lan Yang
Abstract:
It is controversial which type of gloss condition (i.e., gloss language or gloss position) is more effective for second or foreign language (L2) vocabulary learning. The present study compared performance on learning ten English words under the conditions of L2 English reading with no glosses and with glosses of Chinese equivalents/translations and L2 English definitions at the side of the page and on an attached sheet, for low-intermediate Chinese-speaking learners of English who were technology university students in Taiwan. It was found, first, that performance on the immediate posttest and the delayed posttest was overall better in the gloss conditions than in the no-gloss condition. Next, the glosses of Chinese translations were more effective and sustainable than those of L2 English definitions. Finally, the effects of L2 English glosses at the side of the page were observed to be less sustainable than those on an attached sheet. In addition, an opinion questionnaire also showed a preference for glosses of Chinese translations in L2 English reading. These results are discussed in terms of automated lexical access, sentence processing mechanisms, and the trade-off between the storage and processing functions of the working memory system, as proposed by the capacity theory of language comprehension.
Keywords: glosses of Chinese equivalents/translations, glosses of L2 English definitions, L2 vocabulary learning, L2 English reading
Procedia PDF Downloads 250
18974 Examining the Relational Approach Elements in City Development Strategy of Qazvin 2031
Authors: Majid Etaati, Hamid Majedi
Abstract:
The relational planning approach proposed by Patsy Healey goes beyond physical proximity and emphasizes social proximity. This approach stresses the importance of nodes and the flows between them. Current plans in European cities have incrementally incorporated this approach, but urban plans in Iran have remained very detailed and rigid. In response to the weak evaluation results of the comprehensive planning approach in Qazvin, the local authorities applied the City Development Strategy (CDS) to cope with new urban challenges. The paper begins with an explanation of relational planning and what Healey offers urban planners with regard to spatial strategies, and then surveys the relational factors in the CDS of Qazvin. This study analyzes the extent to which the CDS of Qazvin highlights nodes, flows, and dynamics. In the end, the study concludes that there is a relational understanding of urban dynamics in the plan, but it is weak.
Keywords: relational, dynamics, city development strategy, urban planning, Qazvin
Procedia PDF Downloads 143
18973 The Sublimation of Personal Drama into Mythological Tale: ‘‘The Search of Golden Fleece’’ by Alexander McQueen, Givenchy
Authors: Ani Hambardzumyan
Abstract:
The influence of Greek culture and Greek mythology on the fashion industry is enormous. The first reason behind this is that Greek culture is one of the core elements that formed the clothing tradition in Europe. French fashion houses have always been considered among the leading clothing representatives in the world. As shown in the first chapter, they were among the first to draw inspiration from Greek cultural heritage and apply it when creating their garments. The French fashion industry kept traditional classical elements in clothes for decades. However, from the second half of the 20th century, this idea started to alter step by step. Society was transforming its vision under the influence of avant-garde movements; hence, the fashion industry needed to transform its conception as well. It should be mentioned, though, that fashion brands never stopped looking at the past when creating a new perspective or vision. Paradoxically, Greek mythology and clothing tradition continued to be applied even in the search for new ideas or new interpretations. In 1997, Alexander McQueen presented his first Haute Couture collection for the French fashion house Givenchy, inspired by Greek mythology and titled ‘‘Search for The Golden Fleece.’’ Perhaps this was one of the most controversial Haute Couture shows that a French audience could expect to see and that French media could capture and write about. The paper discusses the Spring/Summer 1997 collection ‘‘The Search of Golden Fleece’’ by Alexander McQueen. It should be mentioned that no research has yet been conducted to analyze the mythological and archetypal nature of the collection, and general observations that go beyond traditional historical reviews are few in number. Here we will observe the designer's transformative new approach to Greek heritage and the media's perception of it when the collection was presented.
On top of that, we will observe Alexander McQueen's life in parallel with the fashion show, since the collection is nothing else but the sublimation of his personal journey and drama.
Keywords: mythology, McQueen, the Argonaut, French fashion, Golden Fleece, Givenchy
Procedia PDF Downloads 124
18972 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the normal, exponential, and inverse Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method, and they confirmed that the method provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset whenever goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring method as an iterative procedure to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
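As an illustration of the threshold idea, here is one generic way to splice an exponential body with a Pareto tail so the density is continuous at the threshold. Note that this is a sketch only: the published Exponential-Pareto composite may use a different parameterization, and the paper's regression layer and Fisher scoring step are not shown:

```python
import numpy as np

def composite_exp_pareto_pdf(x, lam, alpha, theta):
    """Spliced density: truncated Exponential(lam) body on [0, theta] and a
    Pareto(alpha, scale=theta) tail, with mixing weight r chosen so the
    density is continuous at the threshold theta."""
    F1_theta = 1.0 - np.exp(-lam * theta)        # exponential cdf at theta
    f1_theta = lam * np.exp(-lam * theta)        # exponential pdf at theta
    tail_at_theta = alpha / theta                # Pareto pdf at its scale
    r = tail_at_theta / (f1_theta / F1_theta + tail_at_theta)
    x = np.asarray(x, dtype=float)
    body = r * lam * np.exp(-lam * x) / F1_theta
    tail = (1.0 - r) * alpha * theta**alpha / np.maximum(x, theta) ** (alpha + 1.0)
    return np.where(x <= theta, body, tail)

# Sanity check: the spliced density should integrate to ~1 (trapezoid rule).
grid = np.linspace(1e-6, 500.0, 400_001)
pdf = composite_exp_pareto_pdf(grid, lam=0.5, alpha=2.5, theta=5.0)
total = float(np.sum((pdf[1:] + pdf[:-1]) * np.diff(grid)) / 2.0)
print(f"integral of density: {total:.4f}")
```

The body integrates to r and the tail to 1-r by construction, so the weight r is fully determined by the continuity condition at theta; in a regression setting the parameters (lam, alpha, theta) would then be tied to predictors and estimated by maximum likelihood.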
Procedia PDF Downloads 40
18971 Interactive Learning Practices for Classroom Teaching
Authors: Shamshuddin K., Nagaraj Vannal, Diwakar Kulkarni
Abstract:
This paper presents details of teaching and learning pedagogical techniques attempted in the undergraduate engineering program to improve the concentration span of students in a classroom. Details of activities such as valid statements, quiz competitions, classroom papers, group work, and product marketing, used to keep students active for the entire class duration and to improve presentation skills, are presented. These activities have shown tremendous improvement in students' performance in academics, as well as in asking questions, understanding concepts, and interacting with the course instructor. With these pedagogical activities we are able to achieve program outcome elements and ABET program outcomes such as d, i, g, and h, which are difficult to achieve through conventional teaching methods.
Keywords: activities, pedagogy, interactive learning, valid statement, quiz competition, classroom papers, group work, product marketing
Procedia PDF Downloads 650
18970 Extracting Attributes for Twitter Hashtag Communities
Authors: Ashwaq Alsulami, Jianhua Shao
Abstract:
Various organisations often need to understand discussions on social media, such as what the trending topics are and the characteristics of the people engaged in the discussion. A number of approaches have been proposed to extract attributes that would characterise a discussion group. However, these approaches are largely based on supervised learning, and as such they require a large amount of labelled data. In this paper we propose an approach that does not require labelled data but relies on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.
Keywords: attributed community, attribute detection, community, social network
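A lexicon-based attribute detector of the kind described (no labelled data, only lexical resources) can be as simple as counting lexicon hits per community. The word lists below are illustrative placeholders, not the lexical sources used in the paper:

```python
import string

# Illustrative hand-built lexicons: each candidate attribute is backed by a
# small word list, and a hashtag community is labelled with every attribute
# whose terms occur often enough in its tweets.
ATTRIBUTE_LEXICONS = {
    "sports":   {"match", "goal", "team", "league", "player"},
    "politics": {"election", "vote", "policy", "minister", "parliament"},
    "music":    {"album", "concert", "band", "lyrics", "tour"},
}

def detect_attributes(tweets, min_hits=2):
    counts = {attr: 0 for attr in ATTRIBUTE_LEXICONS}
    for tweet in tweets:
        # crude normalization: lowercase and strip edge punctuation
        words = {w.strip(string.punctuation) for w in tweet.lower().split()}
        for attr, lexicon in ATTRIBUTE_LEXICONS.items():
            counts[attr] += len(words & lexicon)
    return sorted(a for a, c in counts.items() if c >= min_hits)

community = ["What a goal by our team!",
             "Best match of the league so far",
             "New album drops before the tour"]
print(detect_attributes(community))  # ['music', 'sports']
```

A real system would draw the lexicons from curated lexical sources (e.g. WordNet-style resources) rather than hand-written sets, and weight hits by term frequency across the community.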
Procedia PDF Downloads 165
18969 The Impact of Social Interaction, Wellbeing and Mental Health on Student Achievement During COVID-19 Lockdown in Saudi Arabia
Authors: Shatha Ahmad Alharthi
Abstract:
Prior research suggests that reduced social interaction can negatively affect well-being and impair mental health (e.g., depression and anxiety), resulting in lower academic performance. The COVID-19 pandemic has significantly limited social interaction among Saudi Arabian school children since the government closed schools and implemented lockdown restrictions to reduce the spread of the disease. These restrictions have resulted in prolonged remote learning for middle school students with unknown consequences for perceived academic performance, mental health, and well-being. This research project explores how middle school Saudi students’ current remote learning practices affect their mental health (e.g., depression and anxiety) and well-being during the lockdown. Furthermore, the study will examine the association between social interaction, mental health, and well-being pertaining to students’ perceptions of their academic achievement. Research findings could lead to a better understanding of the role of lockdown on depression, anxiety, well-being and perceived academic performance. Research findings may also inform policy-makers or practitioners (e.g., teachers and school leaders) about the importance of facilitating increased social interactions in remote learning situations and help to identify important factors to consider when seeking to re-integrate students into a face-to-face classroom setting. Potential implications for future educational research include exploring remote learning interventions targeted at bolstering students’ mental health and academic achievement during periods of remote learning.
Keywords: depression, anxiety, academic performance, social interaction
Procedia PDF Downloads 122
18968 Using Autoencoder as Feature Extractor for Malware Detection
Authors: Umm-E-Hani, Faiza Babar, Hanif Durad
Abstract:
Malware-detection approaches suffer many limitations, due to which all anti-malware solutions have failed to be reliable enough for detecting zero-day malware. Signature-based solutions depend upon signatures that can be generated only once malware has surfaced at least once in the cyber world. Another approach, which works by detecting the anomalies caused in the environment, can easily be defeated by diligently and intelligently written malware. Solutions trained to observe behavior for detecting malicious files have failed to cater to malware capable of detecting a sandboxed or protected environment. Machine learning and deep learning-based approaches greatly suffer when their models are trained with either an imbalanced dataset or an inadequate number of samples. AI-based anti-malware solutions that have been trained with enough samples have targeted a selected feature vector, ignoring the contribution of the leftover features to the maliciousness of malware just to cope with limited hardware processing power. Our research focuses on producing an anti-malware solution for detecting malicious PE files that circumvents the earlier-mentioned shortcomings. Our proposed framework, which is based on automated feature engineering through autoencoders, trains the model over a fairly large dataset. It focuses on the visual patterns of malware samples to automatically extract the meaningful part of the visual pattern. Our experiment has successfully produced a state-of-the-art accuracy of 99.54% on test data.
Keywords: malware, autoencoders, automated feature engineering, classification
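The automated feature engineering step, an autoencoder whose bottleneck supplies the feature vector for a downstream classifier, can be sketched with a linear autoencoder trained by gradient descent on synthetic "malware image" vectors. The paper's actual architecture and PE-to-image preprocessing are not given in the abstract, so everything below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in for the pipeline: binaries rendered as fixed-size grayscale
# "images" (here synthetic 256-dim vectors sharing 8-dim latent structure),
# compressed by a linear autoencoder whose bottleneck output then serves
# as the feature vector for a downstream classifier.
basis = rng.normal(size=(8, 256))
codes = rng.normal(size=(200, 8))
X = np.tanh(codes @ basis)                       # 200 samples, 256 features

dim_z = 8                                        # bottleneck width
W_enc = rng.normal(scale=0.05, size=(256, dim_z))
W_dec = rng.normal(scale=0.05, size=(dim_z, 256))

def loss():
    Z = X @ W_enc                                # encode to the bottleneck
    return float(np.mean((Z @ W_dec - X) ** 2))  # reconstruction error

initial = loss()
lr = 0.01
for _ in range(300):                             # plain gradient descent
    Z = X @ W_enc
    err = Z @ W_dec - X
    W_dec -= lr * Z.T @ err * (2 / err.size)
    W_enc -= lr * X.T @ (err @ W_dec.T) * (2 / err.size)

features = X @ W_enc                             # learned feature vectors
print(f"reconstruction loss {initial:.3f} -> {loss():.3f}")
```

A practical version would use a deep convolutional autoencoder with nonlinearities over the malware images, but the principle is the same: minimizing reconstruction error forces the bottleneck to retain the informative part of the visual pattern, which is then fed to the classifier.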
Procedia PDF Downloads 75
18967 Leveraging Digital Transformation Initiatives and Artificial Intelligence to Optimize Readiness and Simulate Mission Performance across the Fleet
Authors: Justin Woulfe
Abstract:
Siloed logistics and supply chain management systems throughout the Department of Defense (DoD) have led to disparate approaches to modeling and simulation (M&S), a lack of understanding of how one system impacts the whole, and issues with “optimal” solutions that are good for one organization but have dramatically negative impacts on another. Many different systems have evolved to try to understand and account for uncertainty and to reduce the consequences of the unknown. As the DoD undertakes expansive digital transformation initiatives, there is an opportunity to fuse and leverage traditionally disparate data into a centrally hosted source of truth. With a streamlined process incorporating machine learning (ML) and artificial intelligence (AI), advanced M&S will enable informed decisions, guiding program success via optimized operational readiness and improved mission success. One of the current challenges is to leverage the terabytes of data generated by monitored systems to provide actionable information for all levels of users. The implementation of a cloud-based application that analyzes data transactions, learns and predicts future states from current and past states in real time, and communicates those anticipated states is an appropriate solution for reducing latency and improving confidence in decisions. Decisions made from an ML and AI application combined with advanced optimization algorithms will improve the mission success and performance of systems, which will improve the overall cost and effectiveness of any program. The Systecon team constructs and employs model-based simulations, cutting across traditional data silos, aggregating maintenance and supply data, incorporating sensor information, and applying optimization and simulation methods to an as-maintained digital twin, with the ability to aggregate results across a system’s lifecycle and across logical and operational groupings of systems.
This coupling of data throughout the enterprise enables tactical, operational, and strategic decision support, detachable and deployable logistics services, and configuration-based automated distribution of digital technical and product data to enhance supply and logistics operations. As a complete solution, this approach significantly reduces program risk by allowing flexible configuration of data, data relationships, business process workflows, and early test and evaluation, especially budget trade-off analyses. A true capability to tie resources (dollars) to weapon system readiness, in alignment with the real-world scenarios a warfighter may experience, is an objective that has yet to be realized. By developing and solidifying an organic capability to directly relate dollars to readiness and to inform the digital twin, the decision-maker is now empowered through valuable insight and traceability. This type of educated decision-making provides an advantage over adversaries who struggle with maintaining system readiness at an affordable cost. The M&S capability developed allows program managers to independently evaluate system design and support decisions by quantifying their impact on operational availability and operations and support cost, resulting in the ability to simultaneously optimize readiness and cost. This will allow stakeholders to make data-driven decisions when trading cost and readiness throughout the life of the program. Finally, sponsors are able to validate product deliverables with efficiency and much higher accuracy than in previous years.
Keywords: artificial intelligence, digital transformation, machine learning, predictive analytics
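The readiness-versus-cost trade described above can be illustrated with the standard operational-availability formula, A_o = MTBF / (MTBF + MDT). The selection rule, readiness target, and cost figures in the sketch below are purely illustrative assumptions, not Systecon's actual model:

```python
# Hedged sketch: tying support dollars to readiness via operational availability.
# A_o = MTBF / (MTBF + MDT) is the standard formula; everything else is invented.

def operational_availability(mtbf_hours: float, mean_downtime_hours: float) -> float:
    """Classic steady-state operational availability A_o."""
    return mtbf_hours / (mtbf_hours + mean_downtime_hours)

def readiness_cost_tradeoff(candidates, target=0.90):
    """Pick the support option meeting a readiness target at lowest cost.

    `candidates` is a list of (name, mtbf, mdt, annual_cost) tuples.
    """
    feasible = [(name, cost) for name, mtbf, mdt, cost in candidates
                if operational_availability(mtbf, mdt) >= target]
    return min(feasible, key=lambda nc: nc[1]) if feasible else None

options = [
    ("baseline spares", 400.0, 80.0, 1.2e6),   # A_o ~ 0.83 -> misses target
    ("forward stock",   400.0, 30.0, 1.6e6),   # A_o ~ 0.93 -> feasible
    ("full redundancy", 400.0, 10.0, 2.9e6),   # A_o ~ 0.98 -> feasible, costly
]
print(readiness_cost_tradeoff(options))  # ('forward stock', 1600000.0)
```

In a real application, the MTBF and downtime inputs would come from the aggregated maintenance and sensor data feeding the digital twin rather than hand-entered values.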
Procedia PDF Downloads 165
18966 Positive Impact of Cartoon Movies on Adults
Authors: Yacoub Aljaffery
Abstract:
As much as we think negatively about social media such as TV and smartphones, there are many positive benefits our society can get from them. Cartoons, for example, are made specifically for children. However, in this paper, we will show how cartoon videos can have a positive impact on adults, especially college students. Since cartoons are meant to be a good learning tool for children as well as adults, we will show our audience how they can use cartoons in teaching critical thinking and other language skills.
Keywords: social media, TV, teaching, learning, cartoon movies
Procedia PDF Downloads 326
18965 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification
Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro
Abstract:
Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide of Artificial Intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a Deep Learning model that can classify user responses as inputs for an interactive voice response system. A dataset of the Wolof language words ‘yes’ and ‘no’ is collected as audio recordings. A two-stage Data Augmentation approach is adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-Frequency Cepstral Coefficients are implemented. Convolutional Neural Networks (CNNs) have proven to be very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. For voice response classification, the recordings are transformed into sound frequency feature spectra and then classified with a deep CNN, following image classification methodology. The inference model of this trained and reusable Wolof voice response recognition system can be integrated with many applications on both web and mobile platforms.
Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification
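The transformation from raw audio to a time-frequency "image" is the step that lets a CNN treat voice-response classification as image classification. The toy sketch below (pure Python, with assumed frame and hop sizes; a real pipeline would extract MFCCs with a library such as librosa) frames a signal and computes per-frame magnitude spectra:

```python
# Minimal sketch, not the paper's pipeline: frame a 1-D signal and compute a
# naive DFT magnitude spectrum per frame. Frame/hop sizes are assumptions.
import cmath
import math

def frame_signal(x, frame_len=64, hop=32):
    """Split a signal into overlapping frames."""
    return [x[i:i + frame_len] for i in range(0, len(x) - frame_len + 1, hop)]

def magnitude_spectrum(frame):
    """Naive O(n^2) DFT magnitude; fine for a sketch, use an FFT in practice."""
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

# A pure tone whose period divides the frame length: energy lands in one bin.
sr, tone_bin = 64, 8
signal = [math.sin(2 * math.pi * tone_bin * t / sr) for t in range(256)]
spec = [magnitude_spectrum(f) for f in frame_signal(signal)]
print(max(range(len(spec[0])), key=lambda k: spec[0][k]))  # 8
```

Stacking these per-frame spectra column by column yields the 2-D array that a CNN then consumes exactly like a grayscale image.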
Procedia PDF Downloads 122
18964 Ethnomedicinal Plants Used for Gastrointestinal Ailments by the People of Tribal District Kinnaur (Himachal Pradesh) India
Authors: Geeta, Richa, M. L. Sharma
Abstract:
Himachal Pradesh, a hilly State of India located in the Western Himalayas, with varied altitudinal gradients and climatic conditions, is a repository of plant diversity and the traditional knowledge associated with plants. The State is inhabited by various tribal communities who usually depend upon local plants for curing various ailments. Utilization of plant resources in their day-to-day life has been an age-old practice of the people inhabiting this State. The present study pertains to the tribal district Kinnaur of Himachal Pradesh, located between 77°45’ and 79°00’35” east longitudes and between 31°05’50” and 32°05’15” north latitudes. Being a remote area with only very basic medical facilities, local people mostly use traditional herbal medicines for primary healthcare needs. Traditional healers called “Amji” are usually very secretive in revealing their medicinal knowledge to novices and pass on their knowledge to the next generation orally. As a result, no written records of healing herbs are available. The aim of the present study was to collect and consolidate the ethno-medicinal knowledge of the local people of the district about the use of plants for treating gastrointestinal ailments. The ethnobotanical information was collected from local practitioners, herbal healers and elderly people having rich knowledge about medicinal herbs, through a semi-structured questionnaire and key informant discussions. A total of 46 plant species belonging to 40 genera and 24 families have been identified which are used as cures for gastrointestinal ailments. Among the parts used for gastrointestinal ailments, aerial parts (14%) were followed by the whole plant (13%), root (8%), leaves (6%), flower (5%), fruit and seed (3%) and tuber (1%). These plant species could be prioritized for conservation and subjected to further studies related to phytochemical screening for their authenticity. 
Most of the medicinal plants of the region are collected from the wild and are often harvested for trade. Sustainable harvesting and domestication of the highly traded species from the study area are needed.
Keywords: Amji, gastrointestinal, Kinnaur, medicinal plants, traditional knowledge
Procedia PDF Downloads 398
18963 Developing a Framework for Open Source Software Adoption in a Higher Education Institution in Uganda. A case of Kyambogo University
Authors: Kafeero Frank
Abstract:
This study aimed at developing a framework for open source software adoption in an institution of higher learning in Uganda, with KIU as the study area. There were four main research questions, based on individual staff interaction with the open source software forum, perceived FOSS characteristics, organizational characteristics, and external characteristics as factors that affect open source software adoption. The researcher used a causal-correlational research design to study the effects of these variables on open source software adoption. A quantitative approach was used in this study, with a self-administered questionnaire given to a purposively and randomly selected sample of university ICT staff. The resultant data was analyzed using means, correlation coefficients, and multivariate multiple regression analysis as statistical tools. The study reveals that individual staff interaction with the open source software forum and perceived FOSS characteristics were the primary factors that significantly affect FOSS adoption, while organizational and external factors were secondary, with no significant effect but a significant correlation to open source software adoption. It was concluded that for effective open source software adoption to occur, there must be more effort on the primary factors, with subsequent reinforcement of the secondary factors. Lastly, recommendations were made, in line with the conclusions, for coming up with a Kyambogo University framework for open source software adoption in institutions of higher learning. Areas of further research recommended include: stakeholders’ analysis of open source software adoption in Uganda, its challenges and the way forward; evaluation of the Kyambogo University framework for open source software adoption in institutions of higher learning; framework development for cloud computing adoption in Ugandan universities; 
and framework development for FOSS in the Uganda IT industry.
Keywords: open source software, organisational characteristics, external characteristics, cloud computing adoption
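As a rough illustration of the correlation analysis this kind of study relies on, a Pearson correlation between a factor score and an adoption score can be sketched in a few lines; the Likert-scale responses below are invented for the example, not the study's data:

```python
# Hypothetical sketch of one of the study's statistical tools: Pearson's r
# between "forum interaction" and "FOSS adoption" questionnaire scores.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up 1-5 Likert scores for eight ICT staff respondents:
forum_interaction = [2, 3, 4, 4, 5, 1, 3, 5]
foss_adoption     = [2, 3, 3, 4, 5, 1, 2, 5]
r = pearson_r(forum_interaction, foss_adoption)
print(round(r, 2))  # a strong positive correlation
```

A full analysis would extend this with multivariate multiple regression to separate the primary factors from the secondary ones, as the abstract describes.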
Procedia PDF Downloads 76
18962 Medicompills Architecture: A Mathematical Precise Tool to Reduce the Risk of Diagnosis Errors on Precise Medicine
Authors: Adriana Haulica
Abstract:
Powered by Machine Learning, Precise medicine is now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of Machine Learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive, in the sense that medicine itself is not an exact science. Meanwhile, the progress made in Molecular Biology, Bioinformatics, Computational Biology, and Precise Medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of Artificial Intelligence in Precise medicine. In fact, current Machine Learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept tool for information processing in Precise medicine that delivers diagnosis and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known “needle in a haystack” approach usually used when Machine Learning algorithms have to process differential genomic or molecular data to find biomarkers. 
Also, even if the input is drawn from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool does. This approach deciphers the biological meaning of input data down to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until Bio-Logical operations can be performed on the basis of the “common denominator” rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical “proofs”. The major impact of this architecture is expressed by the high accuracy of the diagnosis. Delivered as a multiple-condition diagnosis, constituted by main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to better design of clinical trials and speed them up.
Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics
Procedia PDF Downloads 75
18961 Towards an Eastern Philosophy of Religion: on the Contradictory Identity of Philosophy and Religion
Authors: Carlo Cogliati
Abstract:
The study of the relationship of philosophical reason with the religious domain has been very much a concern for many of the Western philosophical and theological traditions. In this essay, I will suggest a proposal for an Eastern philosophy of religion based on Nishida’s contradictory identity of the two: philosophy soku hi (is, and yet is not) religion. This will pose a challenge to the traditional Western contents and methods of the discipline. This paper aims to serve three purposes. First, I will critically assess Charlesworth’s typology of the relation between philosophy and religion in the West: philosophy as/for/against/about/after religion. I will also engage Harrison’s call for a global philosophy of religion(s) and argue that, although it expands the scope and the range of the questions to address, it is still Western in its method. Second, I will present Nishida’s logic of absolutely contradictory self-identity as the instrument to transcend the dichotomous pair of identity and contradiction: ‘A is A’ and ‘A is not A’. I will then explain how this ‘concrete’ logic of the East, as opposed to the ‘formal’ logic of the West, best exhibits the bilateral dynamic relation between philosophy and religion. Even as Nishida argues for the non-separability of the two, he is also aware of and committed to their mutual non-reducibility. Finally, I will outline the resulting new relation between God and creatures. In his philosophy soku hi religion, Nishida replaces the traditional Western dualistic concept of God with the Eastern non-dualistic understanding of God as “neither transcendent nor immanent, and at the same time both transcendent and immanent.” God is therefore a self-identity of contradiction, nowhere and yet everywhere present in the world of creatures. God as absolute being is also absolute nothingness: the world of creatures is the expression of God’s absolute self-negation. 
The overarching goal of this essay is to offer an alternative to traditional Western approaches to philosophy of religion based on Nishida’s logic of absolutely contradictory self-identity, as an example of philosophical and religious (counter)influence. The resulting relationship between philosophy and religion calls for a revision of traditional concepts and methods. The outcome is not to reformulate the Eastern predilection not to sharply distinguish philosophical thought from religious enlightenment, but rather to bring together philosophy and religion in the place of identity and difference.
Keywords: basho, Nishida Kitaro, shukyotetsugaku, soku hi, zettai mujunteki jikodoitsu no ronri
Procedia PDF Downloads 196
18960 Medical Imaging Fusion: A Teaching-Learning Simulation Environment
Authors: Cristina Maria Ribeiro Martins Pereira Caridade, Ana Rita Ferreira Morais
Abstract:
The use of computational tools has become essential in the context of interactive learning, especially in engineering education. In the medical industry, teaching medical image processing techniques is a crucial part of training biomedical engineers, as it has integrated applications with healthcare facilities and hospitals. The aim of this article is to present a teaching-learning simulation tool, developed in MATLAB using a graphical user interface, for medical image fusion that explores different fusion methodologies and processes in combination with image pre-processing techniques. The application runs different algorithms and medical fusion techniques in real time, allowing users to view the original and fused images, compare processed and original images, adjust parameters, and save images. The proposed tool offers an innovative teaching and learning environment: a dynamic and motivating simulation through which biomedical engineering students acquire knowledge about medical image fusion techniques and the skills necessary for the training of biomedical engineers. In conclusion, the developed simulation tool provides real-time visualization of the original and fused images and the possibility to test, evaluate, and extend the student’s knowledge about the fusion of medical images. It also facilitates the exploration of medical imaging applications, specifically image fusion, which is critical in the medical industry. Teachers and students can make adjustments and/or create new functions, making the simulation environment adaptable to new techniques and methodologies.
Keywords: image fusion, image processing, teaching-learning simulation tool, biomedical engineering education
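Although the tool described is implemented in MATLAB, the simplest classical fusion rules it could expose are easy to sketch. Pixel-wise averaging and maximum selection are two standard baselines; the tiny 2x2 nested lists below stand in for co-registered grayscale scans, and the modality labels are illustrative:

```python
# Minimal sketch of two classic pixel-level fusion rules, not the tool itself.

def fuse_average(img_a, img_b, alpha=0.5):
    """Weighted-average fusion: alpha*A + (1 - alpha)*B per pixel."""
    return [[alpha * a + (1 - alpha) * b for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

def fuse_maximum(img_a, img_b):
    """Maximum-selection fusion: keep the brighter pixel from either source."""
    return [[max(a, b) for a, b in zip(ra, rb)] for ra, rb in zip(img_a, img_b)]

ct  = [[100, 200], [50, 0]]    # e.g. bone detail (assumed modality)
mri = [[20, 180], [90, 240]]   # e.g. soft-tissue detail (assumed modality)
print(fuse_average(ct, mri))   # [[60.0, 190.0], [70.0, 120.0]]
print(fuse_maximum(ct, mri))   # [[100, 200], [90, 240]]
```

A teaching tool like the one described lets students swap between such rules interactively and observe the effect of the fusion parameter (here, `alpha`) in real time.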
Procedia PDF Downloads 137
18959 Fish Is Back but Fishers Are Out: The Dilemma of the Education Methods Adapted for Co-management of the Fishery Resource
Authors: Namubiru Zula, Janice Desire Busingue
Abstract:
Pro-active educational approaches have lately been adopted globally in the conservation of natural resources. This led to the introduction of the co-management system, which worked for some European countries in the conservation of sharks and other natural resources. However, this approach has drastically failed in the fishery sector on Lake Victoria, and the punitive education approach has been re-instated. Literature is readily available on punitive educational approaches but scanty on pro-active ones. This article analyses the pro-active approach adopted by the Department of Fisheries for the orientation of BMU leaders in a co-management system. The study is interpreted through a social constructivist lens for co-management of the fishery resource, to ensure that fishers are also back to fishing sustainably. It highlights some of the education methods used, methodological challenges that included the power and skills gap of the facilitators and program designers, and some implications for practice.
Keywords: beach management units, fishers, education methods, proactive approach, punitive approach
Procedia PDF Downloads 129
18958 Nutritional Potential and Traditional Uses of High Altitude Wild Edible Plants in Eastern Himalayas, India
Authors: Hui Tag, Jambey Tsering, Pallabi Kalita Hui, Baikuntha Jyoti Gogoi, Vijay Veer
Abstract:
Food security issues and their relevance in the high mountain regions of the world have often been neglected. Wild edible plants have played a major role in livelihood security among the tribal communities of the East Himalayan Region since time immemorial. The Eastern Himalayan Region of India is one of the mega-diverse regions of the world, rated among the top 12 Global Biodiversity Hotspots by IUCN and recognized as one of the 200 significant eco-regions of the globe. The region supports one of the world’s richest alpine floras, and about one-third of its species are endemic to the region. There are at least 7,500 flowering plants, 700 orchids, 58 bamboo species, 64 citrus species, 28 conifers, 500 mosses, 700 ferns and 728 lichens. The region is home to more than three hundred different ethnic communities with diverse knowledge of traditional uses of flora and fauna as food, medicine and beverages. The Monpa, Memba and Khamba are among the local communities residing in the high-altitude region of the Eastern Himalaya with rich traditional knowledge related to the utilization of wild edible plants. They are followers of the Mahayana sect of Himalayan Buddhism, are mostly agrarian by primary occupation, and have also relied heavily on wild edible plants for livelihood security during famine for millennia. In the present study, we report traditional uses of 40 wild edible plant species, of which 6 species were analysed at the biochemical level for nutrient contents and free radical scavenging activities. The results show significant free radical scavenging (antioxidant) activity and nutritional potential of the 6 selected wild edible plants used by the local communities of the Eastern Himalayan Region of India.
Keywords: East Himalaya, local community, wild edible plants, nutrition, food security
Procedia PDF Downloads 267
18957 The Data-Driven Localized Wave Solution of the Fokas-Lenells Equation using PINN
Authors: Gautam Kumar Saharia, Sagardeep Talukdar, Riki Dutta, Sudipta Nandy
Abstract:
The physics informed neural network (PINN) method opens up an approach for numerically solving nonlinear partial differential equations, leveraging the fast calculation speed and high precision of modern computing systems. We construct the PINN based on the strong universal approximation theorem, apply the initial-boundary value data and residual collocation points to weakly impose the initial and boundary conditions on the neural network, and choose the optimization algorithms adaptive moment estimation (ADAM) and limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) to optimize the learnable parameters of the neural network. Next, we improve the PINN with a weighted loss function to obtain both the bright and dark soliton solutions of the Fokas-Lenells equation (FLE). We find that the proposed scheme of adjustable weight coefficients in the PINN has a better convergence rate and generalizability than the basic PINN algorithm. We believe that the PINN approach to solving partial differential equations appearing in nonlinear optics will be useful for studying various optical phenomena.
Keywords: deep learning, optical soliton, neural network, partial differential equation
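The weighted loss function is the key modification over the basic PINN. A minimal sketch of such a composite loss, with assumed weight values and toy error samples, might look like the following; in a real PINN the residuals would come from automatic differentiation of the network against the FLE, not from hand-supplied numbers:

```python
# Hedged sketch of a weighted PINN loss: L = w_r*MSE_f + w_0*MSE_ic + w_b*MSE_bc.
# Weight values and error samples below are illustrative assumptions.

def mse(errors):
    """Mean squared error over a list of pointwise errors."""
    return sum(e * e for e in errors) / len(errors)

def pinn_loss(residuals, ic_errors, bc_errors, w_r=1.0, w_0=10.0, w_b=10.0):
    """Weighted composite loss over PDE residual, initial and boundary terms."""
    return w_r * mse(residuals) + w_0 * mse(ic_errors) + w_b * mse(bc_errors)

# Toy errors at residual collocation points and initial/boundary data points:
loss = pinn_loss(residuals=[0.1, -0.2], ic_errors=[0.05], bc_errors=[0.0, 0.1])
print(round(loss, 4))  # 0.1
```

Raising `w_0` and `w_b` relative to `w_r` forces the optimizer to satisfy the data constraints before polishing the interior residual, which is one plausible reading of why adjustable weights improve convergence.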
Procedia PDF Downloads 134
18956 Synthetic Classicism: A Machine Learning Approach to the Recognition and Design of Circular Pavilions
Authors: Federico Garrido, Mostafa El Hayani, Ahmed Shams
Abstract:
The exploration of the potential of artificial intelligence (AI) in architecture is still embryonic; however, its latent capacity to change design disciplines is significant. ‘Synthetic Classicism’ is a research project that questions the underlying aspects of classically organized architecture, not just in aesthetic terms but also from a geometrical and morphological point of view, intending to generate new architectural information using historical examples as source material. The main aim of this paper is to explore the uses of artificial intelligence and machine learning algorithms in architectural design while creating a coherent narrative to be contained within a design process. The purpose is twofold: on one hand, to develop and train machine learning algorithms to produce architectural information of small pavilions and, on the other, to synthesize new information from previous architectural drawings. These algorithms are intended to ‘interpret’ graphical information from each pavilion and then generate new information from it. The procedure, once these algorithms are trained, is the following: starting from a line profile, a synthetic ‘front view’ of a pavilion is generated; then, using it as source material, an isometric view is created from it, and finally, a top view is produced. Thanks to GAN algorithms, it is also possible to generate front and isometric views without any graphical input at all. The final intention of the research is to produce isometric views out of historical information, such as the pavilions of Sebastiano Serlio, James Gibbs, or John Soane. The idea is to create and interpret new information, not just in terms of historical reconstruction, but also to explore AI as a novel tool in the narrative of a creative design process. 
This research also challenges the idea that algorithmic design is tied only to efficiency or fitness, embracing instead the possibility of a creative collaboration between artificial intelligence and a human designer. Hence the double feature of this research, both analytical and creative: first synthesizing images based on a given dataset, then generating new architectural information from historical references. We find that the possibility of creatively understanding and manipulating historic (and synthetic) information will be a key feature in future innovative design processes. Finally, the main question we propose is whether an AI could be used not just to create an original and innovative group of simple buildings but also to explore the possibility of fostering a novel architectural sensibility grounded in the specificities of the architectural dataset, whether historic, human-made or synthetic.
Keywords: architecture, central pavilions, classicism, machine learning
Procedia PDF Downloads 144
18955 An Approach to Solving Some Inverse Problems for Parabolic Equations
Authors: Bolatbek Rysbaiuly, Aliya S. Azhibekova
Abstract:
Problems concerning the interpretation of well testing results belong to the class of inverse problems of subsurface hydromechanics. The distinctive feature of such problems is that the available additional information depends on the capabilities of oilfield experiments. Another factor that should not be overlooked is the existence of errors in the test data. To determine reservoir properties, some inverse problems for parabolic equations were investigated. An approach to solving these inverse problems based on the method of regularization is proposed.
Keywords: iterative approach, inverse problem, parabolic equation, reservoir properties
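The regularization idea can be illustrated on a linearized toy version of the problem: rather than solving A m = d directly (ill-conditioned when the measured data d are noisy), one minimizes ||A m - d||² + λ||m||², i.e. solves (AᵀA + λI) m = Aᵀd. The 2x2 operator and λ value below are illustrative only, not taken from the paper:

```python
# Hedged sketch of Tikhonov regularization for a nearly singular 2x2 system.

def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - b[1] * M[0][1]) / det,
            (M[0][0] * b[1] - M[1][0] * b[0]) / det]

def tikhonov(A, d, lam):
    """Regularized least squares: solve (A^T A + lam*I) m = A^T d."""
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    Atd = [sum(A[k][i] * d[k] for k in range(2)) for i in range(2)]
    AtA[0][0] += lam
    AtA[1][1] += lam
    return solve2(AtA, Atd)

A = [[1.0, 1.0], [1.0, 1.001]]   # nearly singular forward operator
d = [2.0, 2.001]                 # "measured" data (true parameters m = [1, 1])
print([round(m, 2) for m in tikhonov(A, d, lam=1e-4)])  # [1.0, 1.0]
```

The small penalty λ stabilizes the inversion against data errors at the cost of a slight bias, which is exactly the trade-off the regularization method manages when estimating reservoir properties from noisy well-test data.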
Procedia PDF Downloads 432