Search results for: pedagogical approaches
3656 Uncertainty Evaluation of Erosion Volume Measurement Using Coordinate Measuring Machine
Authors: Mohamed Dhouibi, Bogdan Stirbu, Chabotier André, Marc Pirlot
Abstract:
Internal barrel wear is a major factor affecting the performance of small caliber guns in their different life phases. Wear analysis is, therefore, a very important process for understanding how wear occurs, where it takes place, and how it spreads, with the aim of improving the accuracy and effectiveness of small caliber weapons. This paper discusses the measurement and analysis of combustion chamber wear for a small-caliber gun using a Coordinate Measuring Machine (CMM). Initially, two different NATO small caliber guns, 5.56x45mm and 7.62x51mm, are considered. A Zeiss Micura CMM equipped with the VAST XTR gold high-end sensor is used to measure the inner profile of the two guns every 300-shot cycle. The CMM parameters, such as (i) the measuring force, (ii) the measured points, (iii) the time of masking, and (iv) the scanning velocity, are investigated. To ensure minimum measurement error, a statistical analysis is adopted to select a reliable combination of CMM parameters. Next, two measurement strategies are developed to capture the shape and the volume of each gun chamber. Thus, a task-specific measurement uncertainty (TSMU) analysis is carried out for each measurement plan. Different approaches to TSMU evaluation have been proposed in the literature. This paper discusses two different techniques. The first is the substitution method described in ISO 15530 part 3. This approach is based on the use of calibrated workpieces with a similar shape and size to the measured part. The second is the Monte Carlo simulation method presented in ISO 15530 part 4. Uncertainty evaluation software (UES), also known as the Virtual Coordinate Measuring Machine (VCMM), is utilized in this technique to perform a point-by-point simulation of the measurements. To conclude, a comparison between both approaches is performed.
Finally, the results of the measurements are verified through calibrated gauges of several dimensions specially designed for the two barrels. On this basis, an experimental database is developed for further analysis aiming to quantify the relationship between the volume of wear and the muzzle velocity of small caliber guns.
Keywords: coordinate measuring machine, measurement uncertainty, erosion and wear volume, small caliber guns
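The Monte Carlo route of ISO 15530 part 4 can be illustrated with a minimal sketch: repeatedly simulate a measurement by perturbing the probed values with random noise, then take the spread of the simulated results as the uncertainty. The function name, the purely Gaussian noise model, and all parameter values below are illustrative assumptions, not the authors' UES/VCMM implementation:

```python
import random
import statistics

def monte_carlo_uncertainty(true_volume, probe_sigma, n_points, n_trials=10_000, seed=1):
    """Crude Monte Carlo sketch of a task-specific measurement uncertainty.

    Each simulated measurement averages n_points probed values, each
    perturbed by Gaussian probing noise of standard deviation probe_sigma
    (an assumed, simplified error model).
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        probed = [true_volume + rng.gauss(0.0, probe_sigma) for _ in range(n_points)]
        results.append(statistics.fmean(probed))
    u = statistics.stdev(results)   # standard uncertainty of the simulated measurand
    return u, 2.0 * u               # expanded uncertainty with coverage factor k = 2

u, U = monte_carlo_uncertainty(true_volume=1000.0, probe_sigma=0.5, n_points=25)
```

A real VCMM simulation would propagate many more influence quantities (probing geometry, thermal drift, squareness errors) point by point; the sketch only shows the resampling idea.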
Procedia PDF Downloads 152
3655 A Model for Reverse-Mentoring in Education
Authors: Sabine A. Zauchner-Studnicka
Abstract:
As the term indicates, reverse-mentoring flips the classical roles of mentoring: in school, students take over the role of mentors for adults, i.e. teachers or parents. Reverse-mentoring originally stems from US enterprises, which implemented this innovative method in order to draw on the resources of skilled younger employees to enhance the IT competences of senior colleagues. However, reverse-mentoring in schools worldwide is rare. Based on empirical studies and theoretical approaches, this article develops an implementation model for reverse-mentoring in order to bring the significant potential reverse-mentoring has for education into practice.
Keywords: reverse-mentoring, innovation in education, implementation model, school education
Procedia PDF Downloads 248
3654 Cell-Cell Interactions in Diseased Conditions Revealed by Three Dimensional and Intravital Two Photon Microscope: From Visualization to Quantification
Authors: Satoshi Nishimura
Abstract:
Although much information has been garnered from the genomes of humans and mice, it remains difficult to extend that information to explain physiological and pathological phenomena. This is because the processes underlying life are by nature stochastic and fluctuate with time. Thus, we developed a novel "in vivo molecular imaging" method based on single- and two-photon microscopy. We visualized and analyzed many life phenomena, including common adult diseases. We integrated the knowledge obtained and established new models that will serve as the basis for new minimally invasive therapeutic approaches.
Keywords: two photon microscope, intravital visualization, thrombus, artery
Procedia PDF Downloads 373
3653 Automated Adaptions of Semantic User- and Service Profile Representations by Learning the User Context
Authors: Nicole Merkle, Stefan Zander
Abstract:
Ambient Assisted Living (AAL) describes a technological and methodological stack (e.g. formal model-theoretic semantics, rule-based reasoning and machine learning) for addressing different aspects of the behavior, activities and characteristics of humans. Hence, a semantic representation of the user environment and its relevant elements is required in order to allow assistive agents to recognize situations and deduce appropriate actions. Furthermore, the user and his/her characteristics (e.g. physical, cognitive, preferences) need to be represented with a high degree of expressiveness in order to allow software agents a precise evaluation of the users' context models. The correct interpretation of these context models highly depends on temporal and spatial circumstances as well as individual user preferences. In most AAL approaches, model representations of real-world situations represent the current state of a universe of discourse at a given point in time while neglecting transitions between a set of states. The AAL domain thus currently lacks approaches that address the dynamic adaptation of context-related representations. Semantic representations of relevant real-world excerpts (e.g. user activities) help cognitive, rule-based agents to reason and make decisions in order to help users in appropriate tasks and situations. However, rules and reasoning on semantic models are not sufficient for handling uncertainty and fuzzy situations. A certain situation can require different (re-)actions in order to achieve the best results with respect to the user and his/her needs. But what is the best result? To answer this question, we need to consider that every smart agent is required to achieve an objective, but this objective is mostly defined by domain experts, who can also fail in their estimation of what is desired by the user and what is not.
Hence, a smart agent has to be able to learn from context history data and estimate or predict what is most likely in certain contexts. Furthermore, different agents with contrary objectives can cause collisions, as their actions influence the user's context and its constituting conditions in unintended or uncontrolled ways. We present an approach for dynamically updating a semantic model with respect to the current user context that allows flexibility of the software agents and enhances their conformance in order to improve the user experience. The presented approach adapts rules by learning from sensor evidence and user actions using probabilistic reasoning approaches, based on given expert knowledge. The semantic domain model consists basically of device-, service- and user-profile representations. In this paper, we present how this semantic domain model can be used to compute the probability of matching rules and actions. We apply this probability estimation to compare the current domain model representation with the computed one in order to adapt the formal semantic representation. Our approach aims at minimizing the likelihood of unintended interferences in order to eliminate conflicts and unpredictable side-effects by updating pre-defined expert knowledge according to the most probable context representation. This enables agents to adapt to dynamic changes in the environment, which enhances the provision of adequate assistance and positively affects user satisfaction.
Keywords: ambient intelligence, machine learning, semantic web, software agents
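The idea of estimating the probability that a rule matches the current context, and revising it from observed sensor evidence, can be reduced to a single Bayesian update step. The function below is a hypothetical simplification of the probabilistic reasoning described above, which in the paper operates over full semantic profile models rather than one binary evidence value:

```python
def update_rule_probability(prior, p_evidence_if_rule, p_evidence_if_not, observed):
    """Bayes update of the probability that a (hypothetical) context rule applies,
    given one piece of binary sensor evidence.

    prior               : current belief that the rule matches the context
    p_evidence_if_rule  : likelihood of seeing the evidence when the rule applies
    p_evidence_if_not   : likelihood of seeing it when the rule does not apply
    observed            : whether the sensor evidence was actually observed
    """
    if observed:
        num = prior * p_evidence_if_rule
        den = num + (1.0 - prior) * p_evidence_if_not
    else:
        num = prior * (1.0 - p_evidence_if_rule)
        den = num + (1.0 - prior) * (1.0 - p_evidence_if_not)
    return num / den

# Evidence strongly associated with the rule raises its probability:
posterior = update_rule_probability(0.5, 0.9, 0.2, observed=True)
```

Repeating this update over a context history is one way an agent could converge on the "most probable context representation" the abstract refers to; the real system would also have to reconcile the updated weights with the formal semantic model.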
Procedia PDF Downloads 281
3652 The Impact of Building Technologies on Local Identity of Urban Settlements
Authors: Eman Nagi Gowid Selim
Abstract:
Nowadays, the relevance of places to people has been questioned from different perspectives. This is attributed to the fact that many international concrete blocks were used to create multi-use public spaces in neighborhoods based on the techniques of mass-production concepts that became one of the most effective ways in building construction, replacing the local and traditional built environment. During the last decades, the world has become increasingly globalized and citizens more mobile, thus ignoring the social and environmental dimensions of local identity. The main enquiries of the research are "How did building technologies affect urban settlements' identity?" and "What are the impacts of technologies and globalization on local identities in urban spaces?" From this perspective, the research presents, firstly, a historical review that shows how old civilizations enhanced their local identities using the newly discovered building materials of each era in different urban settlements and fabrics without losing the identity. The second part of the research highlights the different approaches of building technologies and urban design to present a clear understanding of ways of applying and merging different methodologies to achieve the most efficient urban space design. The third part aims at analyzing some international and national case studies where the form and structure of particular spaces are vital to identifying the morphological elements of urban settlements and the links existing between them. In addition, it determines how building materials are used to enrich the vocabulary of local identity. This part ends with the deduction of guidelines for the integration of the environmental and social dimensions within building technologies' approaches to enhance the sustainability of local identities, and thus ends up with redefining "Urban Identity" to guide future research in such cultural areas.
Finally, the research uses the comparative methodology to apply the deduced guidelines to a national case study, namely "Othman's Towers" in Corniche El Maadi, and ends with results in the form of strategies for future researchers that identify how to ensure local identity in urban settlements using new building materials and technologies to achieve social and environmental comfort within cultural areas.
Keywords: building technologies, cultural context, environmental approach, participatory design, social dimension, urban spaces
Procedia PDF Downloads 304
3651 The Effect of Mobile Technology Use in Education: A Meta-Analysis Study
Authors: Şirin Küçük, Ayşe Kök, İsmail Şahin
Abstract:
Mobile devices are very popular and useful tools for assisting people in daily life. With the advancement of mobile technologies, the issue of mobile learning has been widely investigated in education. Many researchers consider it important to integrate the pedagogical and technical strengths of mobile technology into learning environments. For this reason, the purpose of this research is to examine the effect of mobile technology use in education using the meta-analysis method. Meta-analysis is a statistical technique which combines the findings of independent studies on a specific subject. In this respect, articles will be examined by searching the databases for studies conducted between 2005 and 2014. It is expected that the results of this research will contribute to future research related to mobile technology use in education.
Keywords: mobile learning, meta-analysis, mobile technology, education
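The statistical core of a meta-analysis, pooling the effect sizes of independent studies, can be sketched in a few lines. This is a fixed-effect (inverse-variance) sketch with made-up numbers; the study itself may well use a random-effects model:

```python
def fixed_effect_meta(effects, variances):
    """Fixed-effect meta-analysis: pool study effect sizes by
    inverse-variance weighting and return the pooled effect and its
    standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5   # standard error of the pooled effect
    return pooled, se

# Two hypothetical studies with equal variance pool to the plain mean:
pooled, se = fixed_effect_meta([0.5, 0.3], [0.04, 0.04])
```

With unequal variances the more precise study would dominate the pooled estimate, which is the point of the weighting.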
Procedia PDF Downloads 721
3650 Video Summarization: Techniques and Applications
Authors: Zaynab El Khattabi, Youness Tabii, Abdelhamid Benkaddour
Abstract:
Nowadays, huge multimedia repositories make the browsing, retrieval and delivery of video contents very slow and even difficult tasks. Video summarization has been proposed to enable faster browsing of large video collections and more efficient content indexing and access. In this paper, we focus on approaches to video summarization. Video summaries can be generated in many different forms; however, the two fundamental ways to generate summaries are static and dynamic. We present different techniques for each mode in the literature and describe some features used for generating video summaries. We conclude with perspectives for further research.
Keywords: video summarization, static summarization, video skimming, semantic features
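A minimal sketch of the static mode is keyframe selection by feature drift: start a new keyframe whenever a frame's color histogram differs enough from the last selected one. The histogram representation and threshold are illustrative assumptions, not a technique the survey specifically singles out:

```python
def select_keyframes(frame_histograms, threshold):
    """Static summarization sketch: keep frame 0, then keep any frame whose
    L1 histogram distance from the last kept keyframe exceeds threshold."""
    keyframes = [0]
    for i in range(1, len(frame_histograms)):
        last = frame_histograms[keyframes[-1]]
        cur = frame_histograms[i]
        dist = sum(abs(a - b) for a, b in zip(last, cur))
        if dist > threshold:
            keyframes.append(i)
    return keyframes

# Five toy "frames" with 2-bin histograms; a scene change flips the bins:
summary = select_keyframes([[1, 0], [1, 0], [0, 1], [0, 1], [1, 0]], threshold=1.0)
```

A dynamic summary (video skim) would instead keep short segments around such change points rather than single frames.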
Procedia PDF Downloads 402
3649 Detect Circles in Image: Using Statistical Image Analysis
Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee
Abstract:
The aim of this work is to detect geometrical shape objects in an image. In this paper, the object is considered to be a circle shape. The identification requires finding three characteristics, which are the number, size, and location of the object. To achieve the goal of this work, this paper presents an algorithm that combines statistical approaches and image analysis techniques. This algorithm has been implemented to arrive at the major objectives of this paper. The algorithm has been evaluated using simulated data, yields good results, and has then been applied to real data.
Keywords: image processing, median filter, projection, scale-space, segmentation, threshold
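One simple statistical route to two of the three characteristics (location and size; counting the objects requires prior segmentation) is to take coordinate means over the pixels of a single segmented circle. This is an illustrative sketch, not the paper's exact algorithm:

```python
import math

def estimate_circle(pixels):
    """Estimate centre and radius of a single circular object from the
    (x, y) coordinates of its edge pixels: the centre is the coordinate
    mean, the radius the mean distance to that centre."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    r = sum(math.hypot(x - cx, y - cy) for x, y in pixels) / n
    return cx, cy, r

# Synthetic edge pixels on a circle of centre (5, 5) and radius 3:
edge = [(5 + 3 * math.cos(2 * math.pi * k / 360),
         5 + 3 * math.sin(2 * math.pi * k / 360)) for k in range(360)]
cx, cy, r = estimate_circle(edge)
```

Because the means are averages over all pixels, the estimate is fairly robust to moderate pixel noise, which fits the statistical flavor of the approach described above.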
Procedia PDF Downloads 432
3648 Psychoanalytical Foreshadowing: The Application of a Literary Device in Quranic Narratology
Authors: Fateme Montazeri
Abstract:
Literary approaches to the text of the Quran predate the modern period; Suyuti's (d. 1505) encyclopedia of Quranic sciences, Al-Itqan, provides a notable example. In the modern era, the study of Quranic rhetorics received particular attention from Egyptian scholars in the second half of the twentieth century. Amin Al-Khouli (d. 1966), who might be considered the first to argue for the necessity of applying a literary-rhetorical lens to tafseer (Islamic exegesis), and his students championed literary analysis as the most effective approach to the comprehension of the holy text. Western scholars continued the literary criticism of the Islamic scripture by applying to the Quran methodologies similar to those used in biblical studies. In the history of the literary examination of the Quran, the scope of the critical methods applied to the Quranic text has been limited, for the rhetorical approaches to the Quran, in the premodern as well as the modern period, were concerned almost exclusively with the lexical layer of the text, leaving the narratological dimensions insufficiently examined. Recent contributions, by Leyla Ozgur Alhassen, for instance, attempt to fill this lacuna. This paper aims at advancing the study of Quranic narratives by investigating the application of a literary device whose role in the Quranic stories remains unstudied, namely "foreshadowing." This paper focuses on Chapter 12, "Surah al-Yusuf," as its case study. Chapter 12, the single chapter that includes the story of Joseph in one piece, contains several instances in which the events of the story are foreshadowed. As shall be discussed, foreshadowing occurs either through a monologue or dialogue whereby one or more of the characters allude to future happenings, or through the manner in which the setting is described.
Through a close reading of the text, it will be demonstrated that the usage of the rhetorical tool of foreshadowing serves a dual purpose: on the one hand, foreshadowing prepares the reader/audience for the upcoming events in the plot, and on the other hand, it highlights the psychological dimensions of the characters, their thoughts, intentions, and disposition. In analyzing the story, this study shall draw on psychoanalytical criticism to explore the layers of meaning embedded in the Quranic narrative that are unfolded through foreshadowing.
Keywords: foreshadowing, quranic narrative, literary criticism, surah yusuf
Procedia PDF Downloads 154
3647 Survey Paper on Graph Coloring Problem and Its Application
Authors: Prateek Chharia, Biswa Bhusan Ghosh
Abstract:
Graph coloring is one of the prominent concepts in graph theory. It can be defined as a coloring of the various regions of the graph such that all the constraints are fulfilled. In this paper, various graph coloring approaches like greedy coloring, heuristic search for a maximum independent set, and graph coloring using an edge table are described. Graph coloring can be used in various real-time applications like student timetable generation, Sudoku as a graph coloring problem, and GSM phone networks.
Keywords: graph coloring, greedy coloring, heuristic search, edge table, sudoku as a graph coloring problem
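The greedy coloring mentioned above can be stated in a few lines: visit the vertices in some order and give each the smallest color not already used by a colored neighbor. A minimal sketch over an adjacency-list graph:

```python
def greedy_coloring(adjacency):
    """Greedy graph coloring: assign each vertex, in iteration order,
    the smallest non-negative color unused among its colored neighbors.
    adjacency maps each vertex to a list of its neighbors."""
    colors = {}
    for v in adjacency:
        taken = {colors[u] for u in adjacency[v] if u in colors}
        c = 0
        while c in taken:
            c += 1
        colors[v] = c
    return colors

# A triangle needs three colors; a path needs only two:
triangle = {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b']}
coloring = greedy_coloring(triangle)
```

Greedy coloring is fast but order-sensitive: a poor vertex order can use far more colors than the chromatic number, which is why the survey also covers heuristic alternatives.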
Procedia PDF Downloads 540
3646 The Effect of Bihemispheric Transcranial Direct Current Stimulation Therapy on Upper Extremity Motor Functions in Stroke Patients
Authors: Dilek Cetin Alisar, Oya Umit Yemisci, Selin Ozen, Seyhan Sozay
Abstract:
New approaches and treatment modalities are being developed to make patients more functional and independent in stroke rehabilitation. One of these approaches is transcranial direct current stimulation therapy (tDCS), which aims to improve the hemiplegic upper limb function of stroke patients. tDCS therapy is not part of the routine rehabilitation program; however, studies on tDCS therapy in stroke rehabilitation have increased in recent years. Our study aimed to evaluate the effect of tDCS treatment on upper extremity motor function in patients with subacute stroke. 32 stroke patients (16 in the tDCS group, 16 in the sham group) who were hospitalized for rehabilitation in Başkent University Physical Medicine and Rehabilitation Clinic between 01.08.2016 and 20.01.2018 were included in the study. The conventional upper limb rehabilitation program was used for both tDCS and control group patients for 3 weeks, 5 days a week, for 60-120 minutes a day. In addition to the conventional stroke rehabilitation program, bihemispheric tDCS was administered for 30 minutes daily in the tDCS group. Patients were evaluated before treatment and after 1 week of treatment. The Functional Independence Measure (FIM) self-care score, Brunnstrom Recovery Stage (BRS), and Fugl-Meyer (FM) upper extremity motor function scale were used. There was no difference in demographic characteristics between the groups. There were no significant differences in BRS and FM scores between the two groups, but there was a significant difference in FIM score (p=0.05). FIM, BRS, and FM scores improved significantly in the tDCS group between baseline and 1 week of therapy, whereas no difference was found in the sham group (p < 0.001). When BRS and FM scores were compared, there were statistically significant differences in the tDCS group (p < 0.001).
In conclusion, this randomized double-blind study showed that bihemispheric tDCS treatment was superior for upper extremity motor and functional enhancement when added to conventional rehabilitation methods in subacute stroke patients. For tDCS therapy to be used routinely in stroke rehabilitation, more comprehensive, long-term, randomized controlled clinical trials are needed to answer many questions, such as the optimal duration and intensity of treatment.
Keywords: cortical stimulation, motor function, rehabilitation, stroke
Procedia PDF Downloads 127
3645 Expanding the Atelier: Design Lead Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality
Authors: David Sinfield, Thomas Cochrane, Marcos Steagall
Abstract:
While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are the ease of user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary new technologies that failed to gain mainstream adoption or were quickly superseded. Examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. However, we argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies that are focused upon teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Visual communication design-based learning, such as graphic design, illustration, photography and design process, is heavily based on the traditional forms of the classroom environment, whereby student interaction takes place both at peer level and through teacher-based feedback. This makes for a healthy creative learning environment, but does raise other issues in terms of student-to-teacher learning ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and thus we see a decline in creative work from the student. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper will discuss the outcomes of a student project considering the virtual classroom and the techniques involved. The Atelier learning environment is especially suited to the visual communication model, as it deals with the creative processing of ideas that need to be shared in a collaborative manner.
This has proven to be a successful model over the years in the traditional form of design education, but has more recently seen a shift in thinking as we move into a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented Reality, when integrated into the learning experience, can improve the learning motivation and engagement of students. This paper will outline some of the processes used and the findings from the semester-long project that took place.
Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications
Procedia PDF Downloads 238
3644 Conceptualizing Conflict in the Gray Zone: A Comparative Analysis of Diplomatic, Military and Political Lenses
Authors: John Hardy, Paul Lushenko
Abstract:
The twenty-first century international security order has been fraught with challenges to the credibility and stability of the post-Cold War status quo. Although the American-led international system has rarely been threatened directly by dissatisfied states, an underlying challenge to the international security order has emerged in the form of a slow-burning abnegation of small but significant aspects of the status quo. Meanwhile, those security challenges which have threatened to destabilize order in the international system have not clearly belonged to the traditional notions of diplomacy and armed conflict. Instead, the main antagonists have been both states and non-state actors, the issues have crossed national and international boundaries, and contestation has occurred in a 'gray zone' between peace and war. Gray zone conflicts are not easily categorized as military operations, national security policies or political strategies, because they often include elements of diplomacy, military operations, and statecraft in complex combinations. This study applies three approaches to conceptualizing the gray zone in which many contemporary conflicts take place. The first approach frames gray zone conflicts as a form of coercive diplomacy, in which armed force is used to add credibility and commitment to political threats. The second approach frames gray zone conflicts as a form of discrete military operation, in which armed force is used sparingly and is limited to a specific issue. The third approach frames gray zone conflicts as a form of proxy war, in which armed force is used by or through third parties, rather than directly between belligerents. The study finds that each approach to conceptualizing the gray zone accounts for only a narrow range of issues which fall within the gap between traditional notions of peace and war.
However, in combination, all three approaches are useful in explicating the gray zone and understanding the character of contemporary security challenges which defy simple categorization. These findings suggest that coercive diplomacy, discrete military operations, and proxy warfare provide three overlapping lenses for conceptualizing the gray zone and for understanding the gray zone conflicts which threaten international security in the early twenty-first century.
Keywords: gray zone, international security, military operations, national security, strategy
Procedia PDF Downloads 159
3643 Lotus Mechanism: Validation of Deployment Mechanism Using Structural and Dynamic Analysis
Authors: Parth Prajapati, A. R. Srinivas
Abstract:
The purpose of this paper is to validate the concept of the Lotus Mechanism using Computer Aided Engineering (CAE) tools, considering the statics and dynamics through actual time dependence involving inertial forces acting on the mechanism joints. For a 1.2 m mirror made of hexagonal segments, with simple harnesses and three-point supports, the maximum diameter is 400 mm, the minimum segment base thickness is 1.5 mm, and the maximum rib height is considered to be 12 mm. Manufacturing challenges are explored for the segments using manufacturing research and development approaches to enable the use of large lightweight mirrors required for future space systems.
Keywords: dynamics, manufacturing, reflectors, segmentation, statics
Procedia PDF Downloads 373
3642 Introducing the Concept of Sustainable Learning: Redesigning the Social Studies and Citizenship Education Curriculum in the Context of Saudi Arabia
Authors: Aiydh Aljeddani, Fran Martin
Abstract:
Sustainable human development is an essential component of sustainable economic, social and environmental development. Addressing sustainable learning only through the addition of new teaching methods, or embedding certain approaches, is not sufficient on its own to support the goals of sustainable human development. This research project seeks to explore how the process of redesigning the current principles of the curriculum based on the concept of sustainable learning could contribute to preparing a citizen who could later contribute towards sustainable human development. Multiple qualitative methodologies were employed in order to achieve the aim of this study. The main research methods were teachers' field notes, artefacts, informal (unstructured) interviews, passive participant observation, a mini nominal group technique (NGT), a weekly diary, and weekly meetings. The study revealed that the integration of a curriculum for sustainable development, in addition to the use of innovative teaching approaches, was highly valued by students and teachers in social studies sessions. This was due to the fact that it created a positive atmosphere for interaction and aroused both teachers' and students' interest. The content of the new curriculum also contributed to increasing students' sense of shared responsibility by involving them in thinking about solutions to some global issues. This was carried out by addressing these issues through the concept of sustainable development and the theory of Thinking Activity in a Social Context (TASC). Students engaged with the sustainable development sessions intellectually and also applied them practically by designing projects and cut-outs. Ongoing meetings and workshops to develop the work, between the researcher and the teachers and by the teachers themselves, played a vital role in implementing the new curriculum.
The participation of teachers in the development of the project, through working papers, exchanging experiences and introducing amendments to the students' environment, was also critical in the process of implementing the new curriculum. Finally, the concept of sustainable learning can contribute to learning outcomes much better than the current curriculum, and it can better develop the learning objectives in educational institutions.
Keywords: redesigning, social studies and citizenship education curriculum, sustainable learning, thinking activity in a social context
Procedia PDF Downloads 232
3641 A Combination of Mesenchymal Stem Cells and Low-Intensity Ultrasound for Knee Meniscus Regeneration: A Preliminary Study
Authors: Mohammad Nasb, Muhammad Rehan, Chen Hong
Abstract:
Background: Meniscus defects critically alter knee function and lead to degenerative changes. Regenerative medicine applications, including stem cell transplantation, have shown promising efficacy in finding alternatives to overcome the limitations of traditional treatment. However, stem cell therapy remains limited due to substantially reduced cell viability and an inhibitory microenvironment. Since tissue growth and repair are under the control of biochemical and mechanical signals, several approaches have recently been investigated (e.g., low-intensity pulsed ultrasound [LIPUS]) to promote the regeneration process. This study employed LIPUS to improve the growth and osteogenic differentiation of mesenchymal stem cells derived from human embryonic stem cells in order to improve the regeneration of meniscus tissue. Methodology: Mesenchymal stromal cells (MSCs) were transplanted into the epicenter of the injured meniscus in rabbits, which were randomized into two main groups: a treatment group (n=32 New Zealand rabbits) comprising 4 subgroups of 8 rabbits each (LIPUS treatment, MSC treatment, LIPUS with MSC, and control), and a second group (n=9) used to track implanted cells and their progeny using green fluorescence protein (GFP); the GFP group consists of the MSC and LIPUS-MSC combination subgroups. Rabbits were then subjected to histological, immunohistochemical, and MRI assessment. Results: The quantity of newly regenerated tissue in the combination treatment group, which received ultrasound irradiation after mesenchymal stem cell transplantation, was greater at all end points. Likewise, tissue quality scores were greater in knees treated with both approaches compared with controls and single treatments at all end points, achieving significance at twelve and twenty-four weeks (p < 0.05), and p = 0.008 at twelve weeks. Differentiation into type-I and type-II collagen-expressing cells was higher in the combination group at up to twenty-four weeks.
Conclusions: The combination of mesenchymal stem cells and LIPUS showed greater adherence to the sites of meniscus injury, differentiation into cells resembling meniscal fibrochondrocytes, and improvement in both the quality and quantity of meniscal regeneration.
Keywords: stem cells, regenerative medicine, osteoarthritis, knee
Procedia PDF Downloads 119
3640 Methodological Support for Teacher Training in English Language
Authors: Comfort Aina
Abstract:
Modern English, as we all know it to be a foreign language to many, will require training and re-training on the part of teachers and learners alike. As a teacher, you cannot give that which you do not have. Teachers, many of whom are non-native speakers, are required to be competent in solving problems occurring in the teaching and learning processes. They should be conscious of up-to-date information about new approaches, methods and techniques, should be capable in the use of information and communication technology (ICT), and, of course, should work on the improvement of their language components and competence. For teachers to be successful in these goals, they need to be encouraged and motivated. So, for EFL teachers to be successful, they are enrolled in in-service teacher training and ICT training; some of the training they undergo and the benefits accruing from it will be the focus of the paper.
Keywords: training, management, method, English language, EFL teachers
Procedia PDF Downloads 115
3639 A Two-Step Framework for Unsupervised Speaker Segmentation Using BIC and Artificial Neural Network
Authors: Ahmad Alwosheel, Ahmed Alqaraawi
Abstract:
This work proposes a new speaker segmentation approach for two speakers. It is an online approach that does not require prior information about speaker models. It has two phases: in the first phase, a conventional unsupervised BIC-based approach is used to detect speaker changes and train a neural network, while in the second phase, the trained parameters output by the neural network are used to predict speaker changes in the next incoming audio stream. With this approach, accuracy comparable to similar BIC-based approaches is achieved with a significant improvement in computation time. Keywords: artificial neural network, diarization, speaker indexing, speaker segmentation
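BIC-based change detection of the kind used in the first phase typically compares a single-Gaussian model of an analysis window against a two-Gaussian split at a candidate change point; a positive ΔBIC favors two speakers. The following is an illustrative sketch only: the feature dimension, window sizes, and penalty weight are assumptions, not the authors' settings.

```python
import numpy as np

def delta_bic(X, t, lam=1.0):
    """Delta-BIC for a candidate speaker change at frame t.

    X: (N, d) array of acoustic feature vectors (e.g. MFCCs).
    A positive value suggests two different speakers around t.
    """
    N, d = X.shape
    X1, X2 = X[:t], X[t:]

    def logdet_cov(Z):
        # log-determinant of the sample covariance of the segment
        _, ld = np.linalg.slogdet(np.cov(Z, rowvar=False))
        return ld

    # model-complexity penalty: d mean parameters + d(d+1)/2 covariance parameters
    penalty = 0.5 * (d + 0.5 * d * (d + 1)) * np.log(N)
    return (0.5 * N * logdet_cov(X)
            - 0.5 * len(X1) * logdet_cov(X1)
            - 0.5 * len(X2) * logdet_cov(X2)
            - lam * penalty)

# toy check: two clearly different Gaussian "speakers"
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=(200, 4))
b = rng.normal(5.0, 1.0, size=(200, 4))
X = np.vstack([a, b])
print(delta_bic(X, 200) > 0)   # the true change point scores positive
```

In the paper's framework, scores like these would supply the training labels for the neural network used in the second phase.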
Procedia PDF Downloads 502
3638 A Time-Reducible Approach to Compute Determinant |I-X|
Authors: Wang Xingbo
Abstract:
Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to compute the determinant |I-X|. The approach is derived from Newton's identities, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail. By comparison with classical approaches, the new approach is shown to be superior, and it naturally reduces computation time as the efficiency of computing the eigenvalues of the square matrix improves. Keywords: algorithm, determinant, computation, eigenvalue, time complexity
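The underlying idea can be made concrete: since det(I - X) = prod_i (1 - lambda_i), the power sums p_k = tr(X^k) can be converted into the elementary symmetric polynomials e_k of the eigenvalues via Newton's identities, giving det(I - X) = sum over k of (-1)^k e_k. A minimal numpy sketch of this idea (an illustration of the identity, not necessarily the paper's exact algorithm):

```python
import numpy as np

def det_I_minus_X(X):
    """det(I - X) via Newton's identities on the power sums p_k = tr(X^k)."""
    n = X.shape[0]
    # power sums p_0..p_n from traces of matrix powers (p_0 = tr(I) = n)
    p = np.empty(n + 1)
    p[0] = n
    M = np.eye(n)
    for k in range(1, n + 1):
        M = M @ X
        p[k] = np.trace(M)
    # Newton's identities: k * e_k = sum_{j=1..k} (-1)^(j-1) * e_{k-j} * p_j
    e = np.empty(n + 1)
    e[0] = 1.0
    for k in range(1, n + 1):
        s = sum((-1) ** (j - 1) * e[k - j] * p[j] for j in range(1, k + 1))
        e[k] = s / k
    # det(I - X) = prod_i (1 - lambda_i) = sum_k (-1)^k e_k
    return sum((-1) ** k * e[k] for k in range(n + 1))

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 5))
print(np.isclose(det_I_minus_X(X), np.linalg.det(np.eye(5) - X)))  # True
```

The cost is dominated by forming the traces of the matrix powers, which is why the overall complexity is tied to that of an eigenvalue computation.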
Procedia PDF Downloads 415
3637 Use of Fractal Geometry in Machine Learning
Authors: Fuad M. Alkoot
Abstract:
The main component of a machine learning system is the classifier. Classifiers are mathematical models that can perform classification tasks for a specific application area. Additionally, multiple classifiers are often combined, using any of the available methods, to reduce the classifier error rate. The benefits gained from combining multiple classifier designs have motivated the development of diverse approaches to multiple classifiers. We aim to investigate the use of fractal geometry to develop an improved classifier combiner. Initially, we experiment with measuring the fractal dimension of data and use the results in the development of a combiner strategy. Keywords: fractal geometry, machine learning, classifier, fractal dimension
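One common way to measure the fractal dimension of data is the box-counting method: count occupied grid cells at successively finer scales and fit the slope of log N(s) against log s. The abstract does not specify which estimator is used, so the following is a sketch under the assumption of 2-D point data:

```python
import numpy as np

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a 2-D point set."""
    pts = np.asarray(points, dtype=float)
    # normalize coordinates into the unit square
    lo = pts.min(axis=0)
    span = np.ptp(pts, axis=0) + 1e-12
    pts = (pts - lo) / span
    counts = []
    for s in scales:
        # number of occupied boxes on an s-by-s grid
        boxes = np.unique(np.floor(pts * s).astype(int), axis=0)
        counts.append(len(boxes))
    # slope of log N(s) vs log s is the dimension estimate
    D, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return D

# a uniformly filled unit square should have dimension close to 2
rng = np.random.default_rng(2)
square = rng.random((20000, 2))
print(round(box_counting_dimension(square), 1))  # 2.0
```

For real classifier-combination work, the dimension would be estimated per feature subset or per class region rather than on a synthetic square.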
Procedia PDF Downloads 218
3636 Transformers in Gene Expression-Based Classification
Authors: Babak Forouraghi
Abstract:
A genetic circuit is a collection of interacting genes and proteins that enables individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains, which is heavily dependent on the spatial arrangement of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradients and low efficiency, as they sequentially process past states and compress contextual information into a bottleneck when input sequences are long.
In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In a previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machines, and artificial neural networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs. Keywords: transformers, generative AI, gene expression design, classification
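DNABERT-family models tokenize DNA into overlapping k-mers rather than single bases, which is what lets attention operate over gene-scale context. A minimal sketch of that tokenization (k=6 is one of DNABERT's standard settings; the input sequence here is made up for illustration):

```python
def kmer_tokenize(seq, k=6):
    """Split a DNA sequence into overlapping k-mer tokens,
    the input unit used by DNABERT-style transformer models."""
    seq = seq.upper()
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

tokens = kmer_tokenize("ATGGCGTACGT", k=6)
print(tokens[:3])  # ['ATGGCG', 'TGGCGT', 'GGCGTA']
```

Each k-mer becomes one token in the transformer's vocabulary, so a sequence of length L yields L - k + 1 tokens over which self-attention is computed.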
Procedia PDF Downloads 59
3635 Kuwait Environmental Remediation Program: Waste Management Data Analytics for Planning and Optimization of Waste Collection
Authors: Aisha Al-Baroud
Abstract:
The United Nations Compensation Commission (UNCC), the Kuwait National Focal Point (KNFP), and Kuwait Oil Company (KOC) cooperated in a joint project to undertake comprehensive and collaborative efforts to remediate 26 million m3 of crude-oil-contaminated soil that had resulted from the Gulf War in 1990/1991. These efforts are referred to as the Kuwait Environmental Remediation Program (KERP). KOC has developed a Total Remediation Solution (TRS) for KERP, which will guide the remediation projects; it comprises alternative remedial solutions, with treatment techniques that include limited landfills for disposal of non-treatable soil materials, and relies on treating certain ranges of Total Petroleum Hydrocarbon (TPH) contamination with the most appropriate remediation techniques. The KERP remediation projects will be implemented within KOC's oilfields in North and South East Kuwait. The objectives of this remediation project are to clear land for field development and to treat all oil-contaminated features (dry oil lakes, wet oil lakes, and oil-contaminated piles) through the TRS plan, optimizing the treatment processes and minimizing the volume of contaminated materials placed into landfills. The treatment strategy comprises excavation and transportation (E&T) of oil-contaminated soils from contaminated land to remote treatment areas and the use of appropriate remediation technologies, or a combination of treatment technologies, to achieve the remediation target criteria (RTC). KOC has awarded five mega-projects to this end and is currently in the execution phase. As part of the company's commitment to the environment, and in fulfillment of the mandatory HSSEMS procedures, all remediation contractors need to report waste generation data from the various project activities on a monthly basis. Data on waste generation are collected in order to implement cost-efficient and sustainable waste management operations.
Data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as the basis for waste management and collection. The results obtained highlight the potential of advanced data analytics approaches in producing more detailed waste generation information for the planning and optimization of waste collection and recycling. Keywords: waste, technologies, KERP, data, soil
Procedia PDF Downloads 113
3634 Design Architecture Anti-Corruption Commission (KPK) According to KPK Law: Strong or Weak?
Authors: Moh Rizaldi, Ali Abdurachman, Indra Perwira
Abstract:
The biggest demonstration since the 1998 reforms, which took place in Indonesia over several days at the end of 2019, did not deter the People's Representative Council (Dewan Perwakilan Rakyat, or DPR) and the President from enacting Law 19 of 2019 (the KPK Law). There is a central issue to be highlighted, namely whether the change is intended to strengthen or to weaken the KPK. To address this question, the analysis focuses on two agency principles, namely the independence principle and the control principle, as seen from three perspectives: the legal substance, the legal structure, and the legal culture. The research method is normative, with conceptual, historical, and statute approaches. The argument of this paper is that the KPK Law has cut most of the KPK's authority; as a result, the KPK has become symbolic, or toothless, in combating corruption. Keywords: control, independent, KPK, law no. 19 of 2019
Procedia PDF Downloads 125
3633 Multiscale Modeling of Damage in Textile Composites
Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese
Abstract:
Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in the aerospace, automotive, and maritime industries. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the meso-scale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus their interactions and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites.
For the meso-scale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method, and another that makes use of an embedded semi-analytical approach. The goal is the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility. Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites
Procedia PDF Downloads 354
3632 Twin Deficits Hypothesis: The Case of Turkey
Authors: Mehmet Mucuk, Ayşen Edirneligil
Abstract:
Budget and current account deficits are major problems for all countries. There are different approaches to the relationship between the budget deficit and the current account deficit. While the Keynesian view accepts that there is a causal link between these variables, the Ricardian equivalence hypothesis rejects it. The aim of this study is to analyze the validity of the Keynesian view for the Turkish economy using VAR analysis with monthly data for the period 2006-2014. In this context, the Johansen cointegration test, impulse-response functions, and variance decomposition tests will be used. Keywords: budget deficit, current account deficit, Turkish economy, twin deficits
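The first step of such an analysis is fitting a vector autoregression to the two deficit series. As an illustrative sketch only, here is a VAR(1) estimated by least squares on simulated data; the coefficient matrix and noise level are assumptions, and the Johansen, impulse-response, and variance decomposition steps of the study would require a full econometrics package:

```python
import numpy as np

# simulate a stable bivariate VAR(1): y_t = A y_{t-1} + noise
# (think of the two series as budget and current account deficits)
rng = np.random.default_rng(5)
A_true = np.array([[0.5, 0.2],
                   [0.3, 0.4]])      # assumed lag-1 coefficient matrix
y = np.zeros((500, 2))
for t in range(1, 500):
    y[t] = A_true @ y[t - 1] + rng.normal(0, 0.1, 2)

# OLS estimation: Y = Ylag @ A^T, so lstsq recovers A^T
Y, Ylag = y[1:], y[:-1]
A_hat = np.linalg.lstsq(Ylag, Y, rcond=None)[0].T
print(np.allclose(A_hat, A_true, atol=0.15))  # True
```

The off-diagonal entries of the estimated matrix are what a Keynesian causal link from the budget deficit to the current account deficit would show up in.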
Procedia PDF Downloads 425
3631 Social Structure, Involuntary Relations and Urban Poverty
Authors: Mahmood Niroobakhsh
Abstract:
This article deals with specific structuralist approaches to explaining a certain kind of social problem. The widespread presence of poverty is a reminder of deep-rooted, unresolved problems in social relations. Recognizing the role the social system expects of an individual frames poverty as derived from an interrelated social structure. Once the individual is enabled to act on this role in the course of social interaction, reintegration of the poor into society may take place. Poverty and housing type are reflections of the underlying social structure, primarily the structure's elements, systemic interrelations, and the overall strength or weakness of that structure. Poverty varies with social structure in that stronger structures are less likely to produce poverty. Keywords: absolute poverty, relative poverty, social structure, urban poverty
Procedia PDF Downloads 679
3630 Less Calculations and More Stories: Improving Financial Education for Young Women
Authors: Laura de Zwaan, Tracey West
Abstract:
There is a sustained, observable gender gap in financial literacy, with females consistently having lower levels than males. This research explores the knowledge and experiences of high school students in Australia aged 14 to 18 in order to understand how this gap can be narrowed. Using a predominantly qualitative approach, we find evidence of impacts on financial literacy from financial socialization and socio-economic environment. We also find evidence that current teaching and assessment approaches to financial literacy may disadvantage female students. We conclude by offering recommendations to improve the way financial literacy education is delivered within the curriculum. Keywords: financial literacy, financial socialization, gender, maths
Procedia PDF Downloads 80
3629 Teaching and Education Science as a Way of Enhancing Student’s Skills and Employability
Authors: Nabbengo Minovia
Abstract:
Teaching and education science encompasses a broad spectrum of research and practices aimed at understanding and improving the processes of teaching and learning. This abstract explores key themes within this field, including pedagogical methodologies, educational psychology, curriculum development, and the integration of technology in education. It highlights the importance of evidence-based practices in enhancing student outcomes and fostering lifelong learning. The abstract also discusses current trends such as personalized learning, inclusive education, and the role of educators as facilitators of knowledge and critical thinking. By examining these aspects, this abstract aims to contribute to the ongoing dialogue on effective educational strategies and their impact on shaping future generations. Keywords: employability through skilling, excellence as a way to self-esteem, science as an art, skills gained through learning
Procedia PDF Downloads 27
3628 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice
Authors: Diana Reckien
Abstract:
Vulnerability assessments are increasingly used to support policy-making in complex environments, such as urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of those indices across an area of interest. Vulnerability studies offer several advantages: they are great communication tools, can inform a wider general debate about environmental issues, and can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also face a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies, and no single framework or methodology has proven to serve best in certain environments; indicators vary highly according to the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting reasons for vulnerability in space. So there is an urgent need to further develop the methodology of vulnerability studies towards a common framework, which is one aim of this paper. We introduce a social vulnerability approach and compare it with the relatively more developed bio-physical and sectoral vulnerability approaches in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area.
The second approach includes variable reduction, mostly Principal Component Analysis (PCA), which reduces the interrelated variables to a smaller number of less correlated components, which are likewise added to form a composite index. We test these two approaches to constructing indices on the area of New York City, as well as two different metrics of input variables, and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise yields particularly different results in the outer regions and parts of the boroughs, such as outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that current adaptation policy and practice in NY may need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e., the high-density areas of Manhattan, central Brooklyn, central Queens, and the southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage in case of hazards in those areas. This is conceivable, e.g., during large heat waves, which would affect the inner and poorer parts of the city more than the outer urban areas. In light of the recent planning practice of NY, one needs to question and discuss who in NY makes adaptation policy for whom, but the presented analyses point towards an under-representation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in current adaptation practice in New York City. Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity
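The PCA-based (variable-reduction) approach described above can be sketched in a few lines: standardize the indicators, extract the leading components, and combine the component scores into one index per unit area. This is an illustrative sketch with synthetic data; the indicator set, number of components, and explained-variance weighting are assumptions, not the study's actual specification:

```python
import numpy as np

def pca_vulnerability_index(X, n_components=2):
    """Composite vulnerability index via PCA.

    X: (areas, indicators) matrix. Indicators are standardized, reduced
    to a few principal components, and the component scores are summed,
    weighted by each component's share of explained variance.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize indicators
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_components].T               # component scores per area
    weights = s[:n_components] ** 2 / np.sum(s ** 2)  # explained-variance share
    return scores @ weights

rng = np.random.default_rng(3)
X = rng.random((100, 6))   # e.g. 100 census tracts, 6 indicators (synthetic)
index = pca_vulnerability_index(X)
print(index.shape)  # (100,)
```

The additive approach discussed first would skip the SVD step and simply sum the standardized indicators with expert-chosen weights, which is exactly the methodological difference the paper maps out across the boroughs.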
Procedia PDF Downloads 395
3627 A Comparative Study on the Effectiveness of Conventional Physiotherapy Program, Mobilization and Taping with Proprioceptive Training for Patellofemoral Pain Syndrome
Authors: Mahesh Mitra
Abstract:
Introduction and Purpose: Patellofemoral pain syndrome (PFPS) is characterized by pain or discomfort seemingly originating from the contact of the posterior surface of the patella with the femur. Given its multifactorial causes and high prevalence, there is a need for a proper management technique, and a more comprehensive, best-possible physiotherapy treatment approach has to be devised to enhance the performance of individuals with PFPS. The purposes of the study were: - to determine the prevalence of PFPS in various sports; - to determine whether there is any relationship between Body Mass Index (BMI) and pain intensity in persons playing a sport; - to evaluate the effect of a conventional physiotherapy program, mobilization, and taping with proprioceptive training on PFPS. Hypotheses: 1. Prevalence is not the same across sporting activities. 2. There is a relationship between BMI and pain intensity. 3. There is no significant difference in improvement among the different treatment approaches. Methodology: A sample of 200 sportsmen was tested for the prevalence of PFPS, and their anthropometric measurements were obtained to check for a correlation between BMI and pain intensity. Eighty diagnosed cases of PFPS were then allotted to three treatment groups and evaluated for pain at rest and during activity and on the KUJALA scale. Group I was treated with conventional physiotherapy, which included TENS application and exercises; Group II was treated with compression mobilization along with exercises; and Group III was treated with taping and proprioceptive exercises. The variables (pain at rest, pain during activity, and KUJALA score) were measured initially, at 1 week, and at the end of 2 weeks of the respective treatment. Data Analysis: - Prevalence percentage of PFPS in each sport - Pearson's correlation coefficient for the relationship between BMI and pain during activity.
- Repeated measures analysis of variance (ANOVA) to test the significance of pre-, mid-, and post-test differences - Newman-Keuls post hoc test - ANCOVA for the differences among Groups I, II, and III. Results and Conclusion: It was concluded that PFPS was most prevalent in volleyball players (80%), followed by football and basketball players (66%), then handball and cricket players (46.6%), and 40% in tennis players. There was no relationship between an individual's BMI and pain intensity. All three treatment approaches were effective, whereas mobilization and taping were more effective than the conventional physiotherapy program. Keywords: PFPS, KUJALA score, mobilization, proprioceptive training
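For reference, the F statistic behind an analysis of variance like the one used here can be sketched for a simple one-way design; the repeated-measures version additionally partitions out subject variance, and the group sizes and pain scores below are synthetic, not the study's data:

```python
import numpy as np

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across treatment groups:
    between-group mean square over within-group mean square."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# synthetic pain scores for three treatment groups (illustrative only)
rng = np.random.default_rng(4)
control = rng.normal(6.0, 1.0, 27)
mobilization = rng.normal(4.0, 1.0, 27)
taping = rng.normal(3.5, 1.0, 26)
F = one_way_anova_F([control, mobilization, taping])
print(F > 3.1)  # exceeds the approximate 0.05 critical value for (2, 77) df
```

A significant F, as in the study's 12- and 24-week comparisons, would then be followed by a post hoc test (here, Newman-Keuls) to locate which group pairs differ.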
Procedia PDF Downloads 315