Search results for: Lagrangian approaches
3336 Numerical Simulation of Filtration Gas Combustion: Front Propagation Velocity
Authors: Yuri Laevsky, Tatyana Nosova
Abstract:
The phenomenon of filtration gas combustion (FGC) was discovered experimentally in the early 1980s. It has a number of important applications in areas such as chemical technology, fire and explosion safety, energy-saving technologies, and oil production. From the physical point of view, FGC may be defined as the propagation of a region of gaseous exothermic reaction through a chemically inert porous medium, as the gaseous reactants seep into the region of chemical transformation. The movement of the combustion front has different modes, and this investigation is focused on the low-velocity regime. The main characteristic of the process is the velocity of combustion front propagation. Computation of this characteristic encounters substantial difficulties because of the strong heterogeneity of the process. The mathematical model of FGC is formed by the energy conservation laws for the temperatures of the porous medium and of the gas, and the mass conservation law for the relative concentration of the reacting component of the gas mixture. The model is homogenized using the two-temperature approach, in which at each point of the continuous medium we specify the solid and gas phases with Newtonian heat exchange between them. The construction of a computational scheme is based on the principles of the mixed finite element method on a regular mesh. The approximation in time is performed by an explicit–implicit difference scheme. Special attention was given to the determination of the combustion front propagation velocity. Direct computation of the velocity as a grid derivative leads to an extremely unstable algorithm. It is worth noting that the term ‘front propagation velocity’ makes sense only for settled motion, for which certain analytical formulae linking the velocity and the equilibrium temperature hold.
A numerical implementation of one such formula, leading to stable computation of the instantaneous front velocity, has been proposed. The resulting algorithm has been applied in a subsequent numerical investigation of the FGC process. In this way, the dependence of the main characteristics of the process on various physical parameters has been studied. In particular, the influence of the combustible gas mixture consumption on the front propagation velocity has been investigated. It has also been reaffirmed numerically that there is an interval of critical values of the interfacial heat transfer coefficient at which a sort of breakdown occurs from slow combustion front propagation to rapid propagation. Approximate boundaries of such an interval have been calculated for some specific parameters. All the results obtained are in full agreement with both experimental and theoretical data, confirming the adequacy of the model and the algorithm constructed. The availability of stable techniques to calculate the instantaneous velocity of the combustion wave allows considering a semi-Lagrangian approach to the solution of the problem.
Keywords: filtration gas combustion, low-velocity regime, mixed finite element method, numerical simulation
Procedia PDF Downloads 301
3335 Automated Adaptions of Semantic User- and Service Profile Representations by Learning the User Context
Authors: Nicole Merkle, Stefan Zander
Abstract:
Ambient Assisted Living (AAL) describes a technological and methodological stack (e.g., formal model-theoretic semantics, rule-based reasoning, and machine learning) for capturing different aspects of the behavior, activities, and characteristics of humans. Hence, a semantic representation of the user environment and its relevant elements is required in order to allow assistive agents to recognize situations and deduce appropriate actions. Furthermore, the user and his/her characteristics (e.g., physical, cognitive, preferences) need to be represented with a high degree of expressiveness in order to allow software agents a precise evaluation of the users’ context models. The correct interpretation of these context models highly depends on temporal and spatial circumstances as well as individual user preferences. In most AAL approaches, model representations of real-world situations represent the current state of a universe of discourse at a given point in time, neglecting transitions between states. The AAL domain thus currently lacks approaches that address the dynamic adaptation of context-related representations. Semantic representations of relevant real-world excerpts (e.g., user activities) help cognitive, rule-based agents to reason and make decisions in order to assist users in appropriate tasks and situations. However, rules and reasoning on semantic models are not sufficient for handling uncertainty and fuzzy situations. A given situation can require different (re-)actions in order to achieve the best results with respect to the user and his/her needs. But what is the best result? To answer this question, we need to consider that every smart agent is required to achieve an objective, but this objective is mostly defined by domain experts, who can also fail in their estimation of what is desired by the user and what is not.
Hence, a smart agent has to be able to learn from context history data and estimate or predict what is most likely in certain contexts. Furthermore, different agents with contrary objectives can cause collisions, as their actions influence the user’s context and its constituting conditions in unintended or uncontrolled ways. We present an approach for dynamically updating a semantic model with respect to the current user context that allows flexibility of the software agents and enhances their conformance in order to improve the user experience. The presented approach adapts rules by learning from sensor evidence and user actions using probabilistic reasoning approaches, based on given expert knowledge. The semantic domain model consists basically of device, service, and user profile representations. In this paper, we present how this semantic domain model can be used to compute the probability of matching rules and actions. We apply this probability estimation to compare the current domain model representation with the computed one in order to adapt the formal semantic representation. Our approach aims at minimizing the likelihood of unintended interferences in order to eliminate conflicts and unpredictable side effects by updating pre-defined expert knowledge according to the most probable context representation. This enables agents to adapt to dynamic changes in the environment, which enhances the provision of adequate assistance and positively affects user satisfaction.
Keywords: ambient intelligence, machine learning, semantic web, software agents
Procedia PDF Downloads 281
3334 The Impact of Building Technologies on Local Identity of Urban Settlements
Authors: Eman Nagi Gowid Selim
Abstract:
Nowadays, the relevance of places to people has been questioned from different perspectives. This is attributed to the fact that many international concrete block types were used to create multi-use public spaces in neighborhoods based on mass-production concepts, which became one of the most effective ways of building construction, replacing the local and traditional built environment. During the last decades, the world has become increasingly globalized and citizens more mobile, thus ignoring the social and environmental dimensions of local identity. The main enquiries of the research are “How did building technologies affect urban settlements’ identity?” and “What are the impacts of technologies and globalization on local identities in urban spaces?” From this perspective, the research presents, firstly, a historical review that shows how old civilizations enhanced their local identities using the newly discovered building materials of each era in different urban settlements and fabrics without losing their identity. The second part of the research highlights different approaches to building technologies and urban design to present a clear understanding of ways of applying and merging different methodologies to achieve the most efficient urban space design. The third part aims at analyzing some international and national case studies where the form and structure of particular spaces are vital to identifying the morphological elements of urban settlements and the links existing between them. In addition, it determines how building materials are used to enrich the vocabulary of local identity. This part ends with the deduction of guidelines for the integration of the environmental and social dimensions within building technologies’ approaches to enhance the sustainability of local identities, and thus ends up redefining “urban identity” to guide future research in such cultural areas.
Finally, the research uses the comparative methodology to apply the deduced guidelines to a national case study, namely “Othman’s Towers” in Corniche El Maadi, and ends up with results in the form of strategies for future researchers that identify how to ensure local identity in urban settlements using new building materials and technologies to achieve social and environmental comfort within cultural areas.
Keywords: building technologies, cultural context, environmental approach, participatory design, social dimension, urban spaces
Procedia PDF Downloads 304
3333 Video Summarization: Techniques and Applications
Authors: Zaynab El Khattabi, Youness Tabii, Abdelhamid Benkaddour
Abstract:
Nowadays, huge multimedia repositories make the browsing, retrieval, and delivery of video contents slow and even difficult tasks. Video summarization has been proposed to enable faster browsing of large video collections and more efficient content indexing and access. In this paper, we focus on approaches to video summarization. Video summaries can be generated in many different forms; however, the two fundamental ways to generate summaries are static and dynamic. We present different techniques for each mode from the literature and describe some features used for generating video summaries. We conclude with perspectives for further research.
Keywords: video summarization, static summarization, video skimming, semantic features
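The static (keyframe-based) mode mentioned in this abstract can be sketched in a few lines: frames are compared by a simple feature distance, and a frame is kept as a keyframe whenever it differs sufficiently from the last one selected. The feature vectors and the threshold below are illustrative assumptions, not details from the paper.

```python
# Static video summarization sketch: pick keyframes by thresholded
# distance to the most recently selected keyframe. Frames are stand-in
# feature vectors (e.g., color histograms); the threshold is assumed.

def l1_distance(a, b):
    """Sum of absolute differences between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def select_keyframes(frames, threshold):
    """Keep the first frame, then every frame whose distance to the
    last kept keyframe exceeds the threshold."""
    if not frames:
        return []
    keyframes = [0]
    for i in range(1, len(frames)):
        if l1_distance(frames[i], frames[keyframes[-1]]) > threshold:
            keyframes.append(i)
    return keyframes

# Toy "video": three visually distinct shots, encoded as 3-bin histograms.
shots = [[1.0, 0.0, 0.0]] * 5 + [[0.0, 1.0, 0.0]] * 5 + [[0.0, 0.0, 1.0]] * 5
print(select_keyframes(shots, threshold=0.5))  # [0, 5, 10]: one per shot
```

Dynamic summarization (video skimming) instead selects short moving segments; the same distance signal can drive segment boundaries.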
Procedia PDF Downloads 401
3332 Teaching Physics: History, Models, and Transformation of Physics Education Research
Authors: N. Didiş Körhasan, D. Kaltakçı Gürel
Abstract:
Many students have difficulty in learning physics from the elementary to the university level. In addition, students' expectancy, attitude, and motivation may be influenced negatively by their experience (failure) and prejudice about physics learning. For this reason, physics educators, who are also physics teachers, search for the best ways to make students' learning of physics easier by considering cognitive, affective, and psychomotor issues in learning. This research critically discusses the history of physics education, fundamental pedagogical approaches and models to teach physics, and the transformation of physics education in light of recent research.
Keywords: pedagogy, physics, physics education, science education
Procedia PDF Downloads 264
3331 Detect Circles in Image: Using Statistical Image Analysis
Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee
Abstract:
The aim of this work is to detect geometrical shape objects in an image. In this paper, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm has been implemented to meet the major objectives of this paper. It has been evaluated using simulated data, yielding good results, and then applied to real data.
Keywords: image processing, median filter, projection, scale-space, segmentation, threshold
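The projection and threshold keywords suggest one simple way such a detector can work; the sketch below is an illustrative assumption, not the paper's exact algorithm. After thresholding, the row and column sums of a binary image containing one filled disk peak at the circle's center, and the widest chord gives the diameter.

```python
# Projection-based circle localization sketch (illustrative): row and
# column sums of a binary image peak at the center of a filled disk,
# and the maximum row sum equals the diameter.

def make_disk(size, cy, cx, r):
    """Binary image containing one filled disk."""
    return [[1 if (y - cy) ** 2 + (x - cx) ** 2 <= r * r else 0
             for x in range(size)] for y in range(size)]

def locate_disk(img):
    """Estimate (cy, cx, r) from row/column projections."""
    rows = [sum(row) for row in img]
    cols = [sum(col) for col in zip(*img)]
    cy = rows.index(max(rows))
    cx = cols.index(max(cols))
    r = max(rows) / 2.0  # widest chord of a disk is its diameter
    return cy, cx, r

img = make_disk(50, 25, 20, 10)
print(locate_disk(img))  # (25, 20, 10.5)
```

Counting multiple circles would additionally require segmenting the projections into separate peaks, which this toy version does not attempt.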
Procedia PDF Downloads 432
3330 Psychoanalytical Foreshadowing: The Application of a Literary Device in Quranic Narratology
Authors: Fateme Montazeri
Abstract:
Literary approaches towards the text of the Quran predate the modern period; Suyuti (d. 1505)’s encyclopedia of Quranic sciences, Al-Itqan, provides a notable example. In the modern era, the study of Quranic rhetorics received particular attention in the second half of the twentieth century from Egyptian scholars. Amin Al-Khouli (d. 1966), who might be considered the first to argue for the necessity of applying a literary-rhetorical lens to tafseer (Islamic exegesis), and his students championed literary analysis as the most effective approach to the comprehension of the holy text. Western scholars continued the literary criticism of the Islamic scripture by applying to the Quran methodologies similar to those used in biblical studies. In the history of the literary examination of the Quran, the scope of the critical methods applied to the Quranic text has been limited, for the rhetorical approaches to the Quran, in the premodern as well as the modern period, were concerned almost exclusively with the lexical layer of the text, leaving the narratological dimensions insufficiently examined. Recent contributions, by Leyla Ozgur Alhassen, for instance, attempt to fill this lacuna. This paper aims at advancing the study of Quranic narratives by investigating the application of a literary device whose role in the Quranic stories remains unstudied, namely “foreshadowing.” This paper shall focus on Chapter 12, “Surah al-Yusuf,” as its case study. Chapter 12, the single chapter that includes the story of Joseph in one piece, contains several instances in which the events of the story are foreshadowed. As shall be discussed, foreshadowing occurs either through a monologue or dialogue whereby one or more of the characters allude to future happenings, or through the manner in which the setting is described.
Through a close reading of the text, it will be demonstrated that the usage of the rhetorical tool of foreshadowing meets a dual purpose: on the one hand, foreshadowing prepares the reader/audience for the upcoming events in the plot, and on the other hand, it highlights the psychological dimensions of the characters, their thoughts, intentions, and disposition. In analyzing the story, this study shall draw on psychoanalytical criticism to explore the layers of meanings embedded in the Quranic narrative that are unfolded through foreshadowing.
Keywords: foreshadowing, quranic narrative, literary criticism, surah yusuf
Procedia PDF Downloads 153
3329 Survey Paper on Graph Coloring Problem and Its Application
Authors: Prateek Chharia, Biswa Bhusan Ghosh
Abstract:
Graph coloring is one of the prominent concepts in graph theory. It can be defined as a coloring of the various regions of a graph such that all constraints are fulfilled. In this paper, various graph coloring approaches, such as greedy coloring, heuristic search for a maximum independent set, and graph coloring using an edge table, are described. Graph coloring can be used in various real-time applications such as student timetable generation, Sudoku as a graph coloring problem, and GSM phone networks.
Keywords: graph coloring, greedy coloring, heuristic search, edge table, sudoku as a graph coloring problem
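The greedy coloring approach the survey covers can be sketched directly; the timetabling framing below (courses as vertices, shared students as edges, colors as exam slots) is an illustrative instance, not an example from the paper.

```python
# Greedy graph coloring sketch: visit vertices in order and assign the
# smallest color not used by any already-colored neighbor. With courses
# as vertices and shared students as edges, colors become exam slots.

def greedy_coloring(adjacency):
    """adjacency: dict mapping vertex -> iterable of neighbors."""
    colors = {}
    for v in adjacency:
        used = {colors[u] for u in adjacency[v] if u in colors}
        c = 0
        while c in used:  # smallest color absent among neighbors
            c += 1
        colors[v] = c
    return colors

# Toy timetabling instance: an edge means two exams share a student.
conflicts = {
    "math": ["physics", "chemistry"],
    "physics": ["math", "biology"],
    "chemistry": ["math"],
    "biology": ["physics"],
}
slots = greedy_coloring(conflicts)
print(slots)  # adjacent exams always land in different slots
```

Greedy coloring is order-dependent and not optimal in general, which is why the surveyed heuristic-search variants reorder vertices or search independent sets.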
Procedia PDF Downloads 539
3328 The Effect of Bihemisferic Transcranial Direct Current Stimulation Therapy on Upper Extremity Motor Functions in Stroke Patients
Authors: Dilek Cetin Alisar, Oya Umit Yemisci, Selin Ozen, Seyhan Sozay
Abstract:
New approaches and treatment modalities are being developed to make patients more functional and independent in stroke rehabilitation. One of these approaches is transcranial direct current stimulation (tDCS), which aims to improve the hemiplegic upper limb function of stroke patients. tDCS is not part of the routine rehabilitation program; however, studies of tDCS in stroke rehabilitation have increased in recent years. Our study aimed to evaluate the effect of tDCS treatment on upper extremity motor function in patients with subacute stroke. 32 stroke patients (16 in the tDCS group, 16 in the sham group) who were hospitalized for rehabilitation in Başkent University Physical Medicine and Rehabilitation Clinic between 01.08.2016 and 20.01.2018 were included in the study. A conventional upper limb rehabilitation program was used for both the tDCS and control group patients for 3 weeks, 5 days a week, for 60-120 minutes a day. In addition to the conventional stroke rehabilitation program, the tDCS group received bihemispheric tDCS for 30 minutes daily. Patients were evaluated before treatment and after 1 week of treatment. The Functional Independence Measure (FIM) self-care score, Brunnstrom Recovery Stage (BRS), and Fugl-Meyer (FM) upper extremity motor function scale were used. There was no difference in demographic characteristics between the groups. There were no significant differences in BRS and FM scores between the two groups, but there was a significant difference in FIM score (p=0.05). FIM, BRS, and FM scores improved significantly in the tDCS group between baseline and 1 week of therapy, whereas no difference was found in the sham group (p < 0.001). When BRS and FM scores were compared, there were statistically significant differences in the tDCS group (p < 0.001).
In conclusion, this randomized double-blind study showed that bihemispheric tDCS treatment, in addition to conventional rehabilitation methods, was superior for upper extremity motor and functional enhancement in subacute stroke patients. For tDCS therapy to be used routinely in stroke rehabilitation, more comprehensive, long-term, randomized controlled clinical trials are needed to answer many questions, such as the optimal duration and intensity of treatment.
Keywords: cortical stimulation, motor function, rehabilitation, stroke
Procedia PDF Downloads 127
3327 Conceptualizing Conflict in the Gray Zone: A Comparative Analysis of Diplomatic, Military and Political Lenses
Authors: John Hardy, Paul Lushenko
Abstract:
The twenty-first century international security order has been fraught with challenges to the credibility and stability of the post-Cold War status quo. Although the American-led international system has rarely been threatened directly by dissatisfied states, an underlying challenge to the international security order has emerged in the form of a slow-burning abnegation of small but significant aspects of the status quo. Meanwhile, those security challenges which have threatened to destabilize order in the international system have not clearly belonged to the traditional notions of diplomacy and armed conflict. Instead, the main antagonists have been both states and non-state actors, the issues have crossed national and international boundaries, and contestation has occurred in a ‘gray zone’ between peace and war. Gray zone conflicts are not easily categorized as military operations, national security policies, or political strategies, because they often include elements of diplomacy, military operations, and statecraft in complex combinations. This study applies three approaches to conceptualizing the gray zone in which many contemporary conflicts take place. The first approach frames gray zone conflicts as a form of coercive diplomacy, in which armed force is used to add credibility and commitment to political threats. The second approach frames gray zone conflicts as a form of discrete military operation, in which armed force is used sparingly and is limited to a specific issue. The third approach frames gray zone conflicts as a form of proxy war, in which armed force is used by or through third parties, rather than directly between belligerents. The study finds that each approach to conceptualizing the gray zone accounts for only a narrow range of issues which fall within the gap between traditional notions of peace and war.
However, in combination, all three approaches are useful in explicating the gray zone and understanding the character of contemporary security challenges which defy simple categorization. These findings suggest that coercive diplomacy, discrete military operations, and proxy warfare provide three overlapping lenses for conceptualizing the gray zone and for understanding the gray zone conflicts which threaten international security in the early twenty-first century.
Keywords: gray zone, international security, military operations, national security, strategy
Procedia PDF Downloads 158
3326 Lotus Mechanism: Validation of Deployment Mechanism Using Structural and Dynamic Analysis
Authors: Parth Prajapati, A. R. Srinivas
Abstract:
The purpose of this paper is to validate the concept of the Lotus Mechanism using Computer Aided Engineering (CAE) tools, considering statics and dynamics through actual time dependence involving inertial forces acting on the mechanism joints. For a 1.2 m mirror made of hexagonal segments, with simple harnesses and three-point supports, the maximum diameter is 400 mm, the minimum segment base thickness is 1.5 mm, and the maximum rib height is considered to be 12 mm. Manufacturing challenges are explored for the segments using manufacturing research and development approaches to enable the use of the large lightweight mirrors required for future space systems.
Keywords: dynamics, manufacturing, reflectors, segmentation, statics
Procedia PDF Downloads 373
3325 Introducing the Concept of Sustainable Learning: Redesigning the Social Studies and Citizenship Education Curriculum in the Context of Saudi Arabia
Authors: Aiydh Aljeddani, Fran Martin
Abstract:
Sustainable human development is an essential component of sustainable economic, social, and environmental development. Addressing sustainable learning only through the addition of new teaching methods, or by embedding certain approaches, is not sufficient on its own to support the goals of sustainable human development. This research project seeks to explore how the process of redesigning the current principles of the curriculum based on the concept of sustainable learning could contribute to preparing citizens who could later contribute towards sustainable human development. Multiple qualitative methodologies were employed in order to achieve the aim of this study. The main research methods were teachers’ field notes, artefacts, informal (unstructured) interviews, passive participant observation, a mini nominal group technique (NGT), a weekly diary, and weekly meetings. The study revealed that the integration of a curriculum for sustainable development, in addition to the use of innovative teaching approaches, was highly valued by students and teachers in social studies sessions. This was because it created a positive atmosphere for interaction and aroused both teachers’ and students’ interest. The content of the new curriculum also contributed to increasing students’ sense of shared responsibility by involving them in thinking about solutions to some global issues. This was carried out by addressing these issues through the concept of sustainable development and the theory of Thinking Actively in a Social Context (TASC). Students engaged with the sustainable development sessions intellectually and also applied them practically by designing projects and cut-outs. Ongoing meetings and workshops to develop the work, between the researcher and the teachers and among the teachers themselves, played a vital role in implementing the new curriculum.
The participation of teachers in the development of the project through working papers, exchanging experiences, and introducing amendments to the students' environment was also critical in the process of implementing the new curriculum. Finally, the concept of sustainable learning can contribute to learning outcomes much better than the current curriculum, and it can better develop the learning objectives of educational institutions.
Keywords: redesigning, social studies and citizenship education curriculum, sustainable learning, thinking actively in a social context
Procedia PDF Downloads 231
3324 A Combination of Mesenchymal Stem Cells and Low-Intensity Ultrasound for Knee Meniscus Regeneration: A Preliminary Study
Authors: Mohammad Nasb, Muhammad Rehan, Chen Hong
Abstract:
Background: Meniscus defects critically alter knee function and lead to degenerative changes. Regenerative medicine applications, including stem cell transplantation, have shown promising efficacy as alternatives that overcome the limitations of traditional treatment. However, stem cell therapy remains limited due to substantially reduced cell viability and an inhibitory microenvironment. Since tissue growth and repair are under the control of biochemical and mechanical signals, several approaches have recently been investigated (e.g., low intensity pulsed ultrasound [LIPUS]) to promote the regeneration process. This study employed LIPUS to improve the growth and osteogenic differentiation of mesenchymal stem cells derived from human embryonic stem cells, in order to improve the regeneration of meniscus tissue. Methodology: The mesenchymal stromal cells (MSCs) were transplanted into the epicenter of the injured meniscus in rabbits, which were randomized into two main groups: a treatment group (n=32 New Zealand rabbits) comprising 4 subgroups of 8 rabbits each (LIPUS treatment, MSC treatment, LIPUS with MSC, and control), and a second group (n=9) to track implanted cells and their progeny using green fluorescent protein (GFP). The GFP group consists of the MSC and LIPUS-MSC combination subgroups. Rabbits were then subjected to histological, immunohistochemistry, and MRI assessment. Results: The quantity of newly regenerated tissue in the combination treatment group, which received ultrasound irradiation after mesenchymal stem cell transplantation, was better at all end points. Likewise, tissue quality scores were greater in knees treated with both approaches compared with controls and single treatments at all end points, achieving significance at twelve and twenty-four weeks [p < 0.05], and [p = 0.008] at twelve weeks. Differentiation into type-I and type-II collagen-expressing cells was higher in the combination group at up to twenty-four weeks.
Conclusions: The combination of mesenchymal stem cells and LIPUS showed greater adherence to the sites of meniscus injury, differentiation into cells resembling meniscal fibrochondrocytes, and improvement in both the quality and quantity of meniscal regeneration.
Keywords: stem cells, regenerative medicine, osteoarthritis, knee
Procedia PDF Downloads 119
3323 Methodological Support for Teacher Training in English Language
Authors: Comfort Aina
Abstract:
Modern English, as a foreign language to many, requires training and re-training on the part of teachers and learners alike. As a teacher, you cannot give what you do not have. Teachers, many of whom are non-native speakers, are required to be competent in solving problems occurring in the teaching and learning processes. They should be conscious of up-to-date information about new approaches, methods, and techniques, and they should be capable in the use of information and communication technology (ICT); of course, they should also work on the improvement of their language components and competence. For teachers to be successful in these goals, they need to be encouraged and motivated. Thus, successful EFL teachers are enrolled in in-service teacher training and ICT training; some of the training they undergo, and the benefits accrued from it, will be the focus of this paper.
Keywords: training, management, method, English language, EFL teachers
Procedia PDF Downloads 114
3322 A Two-Step Framework for Unsupervised Speaker Segmentation Using BIC and Artificial Neural Network
Authors: Ahmad Alwosheel, Ahmed Alqaraawi
Abstract:
This work proposes a new speaker segmentation approach for two speakers. It is an online approach that does not require prior information about speaker models. It has two phases: a conventional approach, such as unsupervised BIC-based segmentation, is utilized in the first phase to detect speaker changes and train a neural network, while in the second phase, the trained parameters output by the neural network are used to predict the next incoming audio stream. Using this approach, accuracy comparable to similar BIC-based approaches is achieved, with a significant improvement in computation time.
Keywords: artificial neural network, diarization, speaker indexing, speaker segmentation
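The first-phase BIC criterion can be sketched on a 1-D feature stream: a window is modeled either by one Gaussian or by two Gaussians split at a candidate index, and the delta-BIC score trades the likelihood gain of the split against a complexity penalty. The 1-D features, penalty weight, and synthetic data below are illustrative assumptions, not the paper's setup.

```python
import math

# BIC-based speaker change detection sketch for a 1-D feature stream:
# compare modeling a window with one Gaussian vs. two Gaussians split
# at index i. Features and the penalty weight (lam) are assumptions.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def delta_bic(xs, i, lam=1.0):
    """Positive value favors a speaker change at index i."""
    n, n1, n2 = len(xs), i, len(xs) - i
    penalty = lam * 0.5 * 2 * math.log(n)  # 2 free params per Gaussian (1-D)
    return (0.5 * n * math.log(variance(xs))
            - 0.5 * n1 * math.log(variance(xs[:i]))
            - 0.5 * n2 * math.log(variance(xs[i:]))
            - penalty)

def detect_change(xs, margin=5):
    """Index maximizing delta-BIC, or None if no split is favored."""
    best = max(range(margin, len(xs) - margin), key=lambda i: delta_bic(xs, i))
    return best if delta_bic(xs, best) > 0 else None

# Two "speakers": a mean shift at index 100, with deterministic jitter.
stream = [0.0 + 0.1 * (i % 5) for i in range(100)] + \
         [4.0 + 0.1 * (i % 5) for i in range(100)]
print(detect_change(stream))
```

In the framework's second phase, such detected boundaries would supply training targets for the neural network that takes over prediction.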
Procedia PDF Downloads 502
3321 A Time-Reducible Approach to Compute Determinant |I-X|
Authors: Wang Xingbo
Abstract:
Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to compute the determinant |I-X|. The approach is derived from Newton's identity, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail for the approach. By comparison with classical approaches, the new approach is shown to be superior, and it naturally reduces the computational time by improving the efficiency of computing the eigenvalues of the square matrix.
Keywords: algorithm, determinant, computation, eigenvalue, time complexity
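The role Newton's identity can play here is worth illustrating: the power sums p_k = tr(X^k) determine the elementary symmetric polynomials e_k of X's eigenvalues, and det(I-X) = prod(1-lambda_i) = sum_k (-1)^k e_k. The pure-Python sketch below demonstrates that identity on small matrices; it is an illustration of the underlying mathematics, not the paper's optimized algorithm.

```python
# Sketch of det(I - X) via Newton's identities: power sums p_k = tr(X^k)
# give the elementary symmetric polynomials e_k of X's eigenvalues, and
# det(I - X) = prod(1 - lambda_i) = sum_k (-1)^k e_k.

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

def det_i_minus_x(X):
    n = len(X)
    # Power sums p_k = tr(X^k) for k = 1..n.
    p, M = [], X
    for k in range(n):
        p.append(trace(M))
        if k < n - 1:
            M = mat_mult(M, X)
    # Newton's identities: k * e_k = sum_{i=1}^{k} (-1)^(i-1) e_{k-i} p_i.
    e = [1.0]
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1))
        e.append(s / k)
    # det(I - X) = sum_{k=0}^{n} (-1)^k e_k.
    return sum((-1) ** k * e[k] for k in range(n + 1))

print(det_i_minus_x([[1.0, 2.0], [3.0, 4.0]]))  # -6.0
```

For X = [[1, 2], [3, 4]], the eigenvalues satisfy (1-l1)(1-l2) = 1 - tr(X) + det(X) = 1 - 5 - 2 = -6, matching the printed value.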
Procedia PDF Downloads 415
3320 Use of Fractal Geometry in Machine Learning
Authors: Fuad M. Alkoot
Abstract:
The main component of a machine learning system is the classifier. Classifiers are mathematical models that can perform classification tasks for a specific application area. Additionally, many classifiers are combined using any of the available methods to reduce the classifier error rate. The benefits gained from combining multiple classifier designs have motivated the development of diverse approaches to multiple classifiers. We aim to investigate the use of fractal geometry to develop an improved classifier combiner. Initially, we experiment with measuring the fractal dimension of data and use the results in the development of a combiner strategy.
Keywords: fractal geometry, machine learning, classifier, fractal dimension
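One common way to measure the fractal dimension of data, which may be what "measuring the fractal dimension" refers to here, is box counting: count the occupied grid boxes N(eps) at several scales and fit the slope of log N against log(1/eps). The sketch below is an illustrative assumption about the measurement step, not the paper's combiner strategy; the scales and test set are arbitrary choices.

```python
import math

# Box-counting sketch for estimating the fractal dimension of a 2-D
# point set: count occupied grid boxes N(eps) at several scales, then
# fit the slope of log N(eps) against log(1/eps) by least squares.

def box_count(points, eps):
    """Number of eps-sized grid boxes containing at least one point."""
    return len({(int(x / eps), int(y / eps)) for x, y in points})

def box_dimension(points, scales):
    xs = [math.log(1.0 / e) for e in scales]
    ys = [math.log(box_count(points, e)) for e in scales]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Least-squares slope of log N(eps) vs. log(1/eps).
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A straight line segment should have dimension close to 1.
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
print(box_dimension(line, [1/4, 1/8, 1/16, 1/32]))  # close to 1.0
```

A space-filling cloud of points would instead yield a slope near 2, so the estimate distinguishes how the data fills its embedding space.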
Procedia PDF Downloads 216
3319 Transformers in Gene Expression-Based Classification
Authors: Babak Forouraghi
Abstract:
A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that have not evolved in nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task, since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take account of the semantic context present in long DNA chains, which is heavily dependent on the spatial arrangement of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from the vanishing gradient and low-efficiency problems when they sequentially process past states and compress contextual information into a bottleneck with long input sequences.
In other words, these architectures are not equipped with the necessary attention mechanisms to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In a previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs. Keywords: transformers, generative AI, gene expression design, classification
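DNABERT-style encoders do not consume raw DNA strings; they first tokenize a sequence into overlapping k-mers before the attention layers see it. The abstract gives no implementation details, so the following is only an illustrative sketch of that preprocessing step (the choice k=6, the special tokens, and all function names are assumptions, not the authors' code):

```python
def kmer_tokenize(sequence, k=6):
    """Split a DNA sequence into overlapping k-mers, the input
    representation used by DNABERT-style encoders (k=6 assumed here)."""
    sequence = sequence.upper()
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def build_vocab(token_lists, specials=("[PAD]", "[CLS]", "[SEP]", "[UNK]")):
    """Map each k-mer to an integer id for the embedding layer."""
    vocab = {tok: i for i, tok in enumerate(specials)}
    for tokens in token_lists:
        for tok in tokens:
            vocab.setdefault(tok, len(vocab))
    return vocab

def encode(sequence, vocab, k=6):
    """Convert one sequence into the id list fed to the transformer."""
    tokens = ["[CLS]"] + kmer_tokenize(sequence, k) + ["[SEP]"]
    return [vocab.get(t, vocab["[UNK]"]) for t in tokens]

# an 8-base sequence yields three overlapping 6-mers
toks = kmer_tokenize("ATGCGTAC", k=6)
```

Because each k-mer overlaps its neighbors by k-1 bases, the token stream preserves the local spatial context of genes that the attention mechanism then relates across the full input length.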
Procedia PDF Downloads 59
3318 Kuwait Environmental Remediation Program: Waste Management Data Analytics for Planning and Optimization of Waste Collection
Authors: Aisha Al-Baroud
Abstract:
The United Nations Compensation Commission (UNCC), Kuwait National Focal Point (KNFP) and Kuwait Oil Company (KOC) cooperated in a joint project to undertake comprehensive and collaborative efforts to remediate 26 million m3 of crude-oil-contaminated soil that had resulted from the Gulf War in 1990/1991. These efforts are referred to as the Kuwait Environmental Remediation Program (KERP). KOC has developed a Total Remediation Solution (TRS) for KERP, which will guide the remediation projects. The TRS comprises alternative remedial solutions and treatment techniques, including limited landfills for the disposal of non-treatable soil materials, and relies on treating certain ranges of Total Petroleum Hydrocarbon (TPH) contamination with the most appropriate remediation techniques. The KERP remediation projects will be implemented within the KOC’s oilfields in North and South East Kuwait. The objectives of this remediation project are to clear land for field development and treat all the oil-contaminated features (dry oil lakes, wet oil lakes, and oil-contaminated piles) through the TRS plan to optimize the treatment processes and minimize the volume of contaminated materials to be placed into landfills. The treatment strategy will comprise Excavation and Transportation (E&T) of oil-contaminated soils from contaminated land to remote treatment areas and the use of appropriate remediation technologies, or a combination of treatment technologies, to achieve the remediation target criteria (RTC). KOC has awarded five mega projects to achieve this and is currently in the execution phase. As part of the company’s commitment to the environment and in fulfillment of the mandatory HSSEMS procedures, all remediation contractors need to report waste generation data from the various project activities on a monthly basis. Data on waste generation is collected in order to implement cost-efficient and sustainable waste management operations. 
Data analytics approaches can be built on top of the data to produce more detailed and timely waste generation information as a basis for waste management and collection. The results obtained highlight the potential of advanced data analytics approaches in producing more detailed waste generation information for the planning and optimization of waste collection and recycling. Keywords: waste, technologies, KERP, data, soil
Procedia PDF Downloads 113
3317 Design Architecture Anti-Corruption Commission (KPK) According to KPK Law: Strong or Weak?
Authors: Moh Rizaldi, Ali Abdurachman, Indra Perwira
Abstract:
The biggest demonstration since the 1998 reforms, which took place in Indonesia over several days at the end of 2019, did not eliminate the intention of the People’s Representative Council (Dewan Perwakilan Rakyat or DPR) and the President to enact Law 19 of 2019 (the KPK Law). There is a central issue to be highlighted, namely whether the change is intended to strengthen or rather to weaken the KPK. To address this question, the analysis focuses on two agency principles, namely the independence principle and the control principle, as seen from three aspects: the legal substance, the legal structure, and the legal culture. The research method is normative, with conceptual, historical, and statute approaches. The argument of this paper is that the KPK Law has cut most of the KPK's authority; as a result, the KPK has become symbolic, or toothless, in combating corruption. Keywords: control, independent, KPK, law no. 19 of 2019
Procedia PDF Downloads 125
3316 Multiscale Modeling of Damage in Textile Composites
Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese
Abstract:
Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in aerospace, automotive, and maritime industry. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the mesoscale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus, their interaction, and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. 
For the mesoscale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method, and the other making use of an embedded semi-analytical approach. The goal will be the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility. Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites
Procedia PDF Downloads 354
3315 Twin Deficits Hypothesis: The Case of Turkey
Authors: Mehmet Mucuk, Ayşen Edirneligil
Abstract:
Budget and current account deficits are major problems for all countries. There are different approaches to the relationship between the budget deficit and the current account deficit. While the Keynesian view accepts that there is a causal link between these variables, the Ricardian equivalence hypothesis rejects it. The aim of this study is to analyze the validity of the Keynesian view for the Turkish economy using VAR analysis with monthly data for the period 2006-2014. In this context, the Johansen cointegration test, impulse-response functions, and variance decomposition tests will be used. Keywords: budget deficit, current account deficit, Turkish economy, twin deficits
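At the core of the VAR machinery the abstract describes is the idea of regressing each variable on the lagged values of all variables in the system. As a hedged illustration of that core step only, here is a minimal numpy sketch that estimates a two-variable VAR(1) by OLS on synthetic, deficit-like series (all data and coefficients below are invented for demonstration; the actual study applies Johansen cointegration, impulse-response, and variance decomposition tests to real Turkish data, typically via a package such as statsmodels):

```python
import numpy as np

def fit_var1(data):
    """Estimate a VAR(1) model y_t = c + A @ y_{t-1} + e_t by OLS.
    `data` has shape (T, k); returns intercept c (k,) and matrix A (k, k)."""
    Y = data[1:]                                       # y_t
    X = np.hstack([np.ones((len(Y), 1)), data[:-1]])   # regressors [1, y_{t-1}]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)          # B stacks [c; A.T]
    return B[0], B[1:].T

# synthetic "budget deficit" / "current account deficit" series,
# generated from a known VAR(1) so the estimate can be checked
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
y = np.zeros((200, 2))
for t in range(1, 200):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

c_hat, A_hat = fit_var1(y)   # A_hat should lie close to A_true
```

With the coefficient matrix in hand, impulse responses follow by iterating A_hat forward from a unit shock, which is the mechanism behind the impulse-response functions the study reports.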
Procedia PDF Downloads 425
3314 Social Structure, Involuntary Relations and Urban Poverty
Authors: Mahmood Niroobakhsh
Abstract:
This article deals with particular structuralist approaches to explaining a certain kind of social problem. The widespread presence of poverty is a reminder of deep-rooted, unresolved problems of social relations. Viewed through the role the social system expects of an individual, poverty derives from an interrelated social structure. Once the poor are enabled to act on their roles in the course of social interaction, their reintegration into society may take place. Poverty and housing type are reflections of the underlying social structure, primarily of the structure’s elements, its systemic interrelations, and the overall strength or weakness of that structure. Poverty varies with social structure in that stronger structures are less likely to produce poverty. Keywords: absolute poverty, relative poverty, social structure, urban poverty
Procedia PDF Downloads 679
3313 Less Calculations and More Stories: Improving Financial Education for Young Women
Authors: Laura de Zwaan, Tracey West
Abstract:
There is a sustained, observable gender gap in financial literacy, with females consistently having lower levels than males. This research explores the knowledge and experiences of high school students in Australia aged 14 to 18 in order to understand how this gap can be narrowed. Using a predominantly qualitative approach, we find evidence that financial socialization and socio-economic environment affect financial literacy. We also find evidence that current teaching and assessment approaches to financial literacy may disadvantage female students. We conclude by offering recommendations to improve the way financial literacy education is delivered within the curriculum. Keywords: financial literacy, financial socialization, gender, maths
Procedia PDF Downloads 80
3312 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice
Authors: Diana Reckien
Abstract:
Vulnerability assessments are increasingly used to support policy-making in complex environments, like urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of those indices across an area of interest. Vulnerability studies have several advantages: they are great communication tools, can inform a wider general debate about environmental issues, and can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also have a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies, and there is no single framework or methodology that has proven to serve best in certain environments; indicators vary highly according to the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped can easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting sources of vulnerability in space. So, there is an urgent need to further develop the methodology of vulnerability studies towards a common framework, which is one motivation for this paper. We introduce a social vulnerability approach, which is compared with bio-physical and sectoral vulnerability approaches in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area. 
The second approach involves variable reduction, mostly Principal Component Analysis (PCA), which reduces a set of interrelated variables to a smaller number of less correlated components, which are likewise added to form a composite index. We test these two approaches to constructing indices, as well as two different metrics of input variables, on the area of New York City and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise yields particularly different results in the outer regions and parts of the boroughs, such as Outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that current adaptation policy and practice in NY might need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e., the high-density areas of Manhattan, Central Brooklyn, Central Queens, and the Southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage in case of hazards. This is conceivable, e.g., during large heatwaves, which would affect the inner and poorer parts of the city more than the outer urban areas. In light of the recent planning practice of NY, one needs to question and discuss who in NY makes adaptation policy for whom, and the presented analyses point towards an underrepresentation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in current adaptation practice in New York City. Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity
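The two index-construction routes just described — a weighted sum of standardized indicators versus projection onto leading principal components — can be sketched side by side. The following is a minimal numpy illustration under assumptions of my own (the indicator names, tract values, and weights are invented for demonstration and are not the study's data):

```python
import numpy as np

def zscore(X):
    """Standardize each indicator column to mean 0, std 1."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def additive_index(X, weights):
    """Additive approach: weighted sum of standardized indicators."""
    return zscore(X) @ weights

def pca_index(X, n_components=1):
    """PCA approach: project standardized indicators onto the leading
    principal component(s) and sum them into one composite score."""
    Z = zscore(X)
    cov = np.cov(Z, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # eigh returns ascending order
    order = np.argsort(vals)[::-1][:n_components]
    return (Z @ vecs[:, order]).sum(axis=1)

# toy data: rows are spatial units (e.g. census tracts), columns are
# indicators (e.g. poverty rate, % elderly, % minority) -- invented values
X = np.array([[0.30, 0.15, 0.40],
              [0.10, 0.25, 0.20],
              [0.50, 0.10, 0.60],
              [0.20, 0.30, 0.10]])
add_scores = additive_index(X, weights=np.array([0.4, 0.2, 0.4]))
pca_scores = pca_index(X)
```

Mapping either score vector back onto the spatial units is what produces the borough-level contrasts the paper compares; the two vectors can rank the same units quite differently, which is exactly the methodological point at issue.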
Procedia PDF Downloads 395
3311 A Comparative Study on the Effectiveness of Conventional Physiotherapy Program, Mobilization and Taping with Proprioceptive Training for Patellofemoral Pain Syndrome
Authors: Mahesh Mitra
Abstract:
Introduction and Purpose: Patellofemoral Pain Syndrome [PFPS] is characterized by pain or discomfort seemingly originating from the contact of the posterior surface of the patella with the femur. Given its multifactorial causes and high prevalence, there is a need for proper management techniques. Also, a more comprehensive and best possible physiotherapy treatment approach has to be devised to enhance the performance of individuals with PFPS. The purposes of the study were: - to determine the prevalence of PFPS in various sports; - to determine whether there exists any relationship between the Body Mass Index [BMI] and pain intensity in persons playing a sport; - to evaluate the effect of a conventional physiotherapy program, mobilization, and taping with proprioceptive training on PFPS. Hypotheses: 1. Prevalence is not the same across different sporting activities. 2. There is a relationship between BMI and pain intensity. 3. There is no significant difference in the improvement with the different treatment approaches. Methodology: A sample of 200 sportsmen was tested for the prevalence of PFPS, and their anthropometric measurements were obtained to check for a correlation between BMI and pain intensity. Of these, 80 diagnosed cases of PFPS were allotted into three treatment groups and evaluated for pain at rest and at activity and on the KUJALA scale. Group I was treated with conventional physiotherapy that included TENS application and exercises, Group II with compression mobilization along with exercises, and Group III with taping and proprioceptive exercises. The variables pain at rest, pain at activity, and KUJALA score were measured initially, at 1 week, and at the end of 2 weeks of the respective treatment. Data Analysis: - Prevalence percentage of PFPS in each sport - Pearson's correlation coefficient to find the relationship between BMI and pain during activity 
- Repeated measures analysis of variance [ANOVA] to find the significance of pre-, mid- and post-test differences - Newman-Keuls post hoc test - ANCOVA for the differences among groups I, II, and III. Results and conclusion: It was concluded that PFPS was most prevalent in volleyball players [80%], followed by football and basketball players [66%], then handball and cricket players [46.6%], and tennis players [40%]. There was no relationship between the BMI of the individual and pain intensity. All three treatment approaches were effective, whereas mobilization and taping were more effective than the conventional physiotherapy program. Keywords: PFPS, KUJALA score, mobilization, proprioceptive training
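The BMI-versus-pain test above rests on Pearson's correlation coefficient, which is simple enough to sketch directly. The readings below are hypothetical illustrations, not the study's data (the real analysis used 200 athletes and statistical software):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's correlation coefficient: covariance of x and y
    normalized by the product of their standard deviations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

# hypothetical BMI and pain-intensity (VAS 0-10) readings -- illustrative only
bmi  = [21.0, 24.5, 27.2, 30.1, 23.3, 26.8]
pain = [4.0, 6.5, 5.0, 6.0, 7.5, 3.5]
r = pearson_r(bmi, pain)   # r near 0 would support "no relationship"
```

A value of r close to zero, as the study reports for BMI against pain, indicates no linear relationship between the two variables; |r| near 1 would indicate a strong one.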
Procedia PDF Downloads 315
3310 Developing a Group Guidance Framework: A Review of Literature
Authors: Abdul Rawuf Hussein, Rusnani Abdul Kadir, Mona Adlina Binti Adanan
Abstract:
Guidance programs have been an essential approach in the helping professions in many institutions of learning, as well as in communities, organizations, and clinical settings. Although the term varies depending on the approaches, objectives, and theories, the core and central element is typically developmental in nature. In this conceptual paper, the researcher will review literature on the concept of group guidance and its impact on students' and individuals' development, discuss the development of a guidance module, and propose a synthesised framework for a group guidance program. Keywords: concept, framework, group guidance, module development
Procedia PDF Downloads 529
3309 Psychiatric/Psychological Issues in the Criminal Courts In Australia
Authors: Judge Paul Smith
Abstract:
This paper addresses the use and admissibility of psychiatric/psychological evidence in Australian courts. There have been different approaches in the courts to the acceptance of such expert evidence. It details how such expert evidence is admissible at trial and sentence. The methodology used is an examination of the decided cases and relevant legislative provisions which relate to the admission of such evidence. The major finding is that the evidence can be admissible if it is relevant to issues in a trial or sentence. It concludes that psychiatric/psychological evidence can be very useful and indeed may be essential at sentence or trial. Keywords: criminal, law, psychological, evidence
Procedia PDF Downloads 53
3308 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon. Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
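The semi-automated coding step described above — programs proposing coding paradigms that a human coder then confirms — can be illustrated at its simplest as keyword matching against a codebook. The codebook categories, terms, and example sentence below are entirely hypothetical, and the D-WISE project's actual pipeline uses far richer NLP signals (entities, co-references, embeddings) than this sketch:

```python
import re
from collections import Counter

# hypothetical codebook: discourse codes mapped to indicator terms
# (invented for demonstration; not the project's coding scheme)
CODEBOOK = {
    "data_protection": ["privacy", "gdpr", "consent", "data protection"],
    "digital_health": ["ehealth", "telemedicine", "health app", "digitization"],
}

def propose_codes(text, codebook=CODEBOOK):
    """Semi-automated coding step: propose discourse codes for a document
    by counting codebook-term matches; a human coder in the closed
    reading process then confirms, corrects, or refines the proposals."""
    text_lc = text.lower()
    hits = Counter()
    for code, terms in codebook.items():
        for term in terms:
            hits[code] += len(re.findall(re.escape(term), text_lc))
    return [code for code, n in hits.most_common() if n > 0]

doc = "The health app raised consent and privacy concerns under GDPR."
codes = propose_codes(doc)
```

The point of the blended-reading design is precisely that such automatic proposals are never final: manual coding feeds back into the program, narrowing the codebook to the content issues at hand.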
Procedia PDF Downloads 226
3307 Hamilton-Jacobi Treatment of Damped Motion
Authors: Khaled I. Nawafleh
Abstract:
In this work, we apply the Hamilton-Jacobi method to obtain solutions of Hamiltonian systems in classical mechanics with two particular structures: the first plays a central role in the theory of time-dependent Hamiltonians, whilst the second is used to treat classical Hamiltonians that include dissipation terms. It is proved that the generalization of problems from the calculus of variations to the nonstationary case can be obtained naturally in the Hamilton-Jacobi formalism. Then, another expression of the geometry of the Hamilton-Jacobi equation is retrieved for Hamiltonians with time-dependent and frictional terms. Both approaches are applied to a number of physical examples. Keywords: Hamilton-Jacobi, time-dependent Lagrangians, dissipative systems, variational principle
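The abstract does not write out the dissipative Hamiltonian it treats; a standard structure for linear damping in the Hamilton-Jacobi setting is the Caldirola-Kanai Hamiltonian, sketched here as an assumed illustration rather than the author's own formulation:

```latex
% Caldirola--Kanai Hamiltonian for a particle of mass m with linear
% damping coefficient \gamma (a standard, assumed example of a
% dissipative structure):
H(q, p, t) = e^{-\gamma t}\,\frac{p^{2}}{2m} + e^{\gamma t}\,V(q)

% The corresponding Hamilton--Jacobi equation for the generating
% function S(q, t), with p = \partial S / \partial q:
\frac{\partial S}{\partial t}
  + e^{-\gamma t}\,\frac{1}{2m}\left(\frac{\partial S}{\partial q}\right)^{2}
  + e^{\gamma t}\,V(q) = 0
```

Hamilton's equations for this H reproduce the damped Newtonian equation m(q̈ + γq̇) = -V'(q), which is why the explicit exponential time dependence is the natural home for frictional terms in the Hamilton-Jacobi formalism.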
Procedia PDF Downloads 179