Search results for: computational thinking
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3121

961 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In genomics, huge amounts of data are produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been projected that more than one billion bases will be produced per year by 2020, a growth rate much faster than Moore's law in computer technology. This makes genomics data increasingly difficult to manage: storing the data, searching it, and finding the information hidden within it. An analysis platform for genomics big data is therefore required. Cloud computing, a recently developed paradigm, enables us to deal with big data more efficiently. Hadoop is a distributed-computing framework that forms the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon's, have adopted this technology, there are few applications in biology. Here, we propose a new algorithm to deal more efficiently with genomics big data such as sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential for computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We discuss the algorithm and its feasibility.
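The abstract leaves the hybrid MapReduce/fuzzy-logic step unspecified; the following is a minimal single-machine sketch of the idea, in which the 3-mer partition keys, the GC-content membership function, and its 0.4/0.6 breakpoints are all illustrative assumptions, not the authors' design:

```python
from collections import defaultdict

def fuzzy_gc_membership(read):
    # Ramp membership of a read in the fuzzy set "GC-rich":
    # 0 below 40% GC content, 1 above 60%, linear in between.
    gc = (read.count("G") + read.count("C")) / len(read)
    return min(1.0, max(0.0, (gc - 0.4) / 0.2))

def map_phase(reads):
    # Map step: emit (key, fuzzy membership) pairs; a 3-mer prefix stands in
    # for whatever partitioning key a real Hadoop job would use.
    for read in reads:
        yield read[:3], fuzzy_gc_membership(read)

def reduce_phase(pairs):
    # Reduce step: aggregate memberships per key with max (a common fuzzy union).
    bins = defaultdict(float)
    for key, mu in pairs:
        bins[key] = max(bins[key], mu)
    return dict(bins)

reads = ["ATGCGGCC", "ATTTAATA", "GGGCCCGG"]
gc_profile = reduce_phase(map_phase(reads))
```

In a real Hadoop deployment the map and reduce phases would run on separate workers; the fuzzy membership simply replaces the crisp counts of a classic word-count job.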

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 299
960 Formation of Human Resources in the Light of Sustainable Development and the Achievement of Full Employment

Authors: Kaddour Fellague Mohammed

Abstract:

In recent years the world has seen significant developments that have affected many aspects of life and many types of institutions, giving rise to a world of globalization dominated by the scientific revolution and tremendous technological change. These forces have contributed to reshaping human resources in contemporary organizations and have created new organizational patterns, while at the same time forcefully raising new values and ideas. Organizations have become more flexible and faster to respond to consumers and environmental conditions; they have overcome the constraints of time and place in communication and human interaction, rely chiefly on advanced information technology to run their operations, and focus on performance, strategic thinking, and a strategic approach in pursuit of superiority and excellence. This new reality has created a growing need for a new type of human resource: one that aims at renewal, aspires to be a strategic player in managing the organization and drafting its strategies, thinks globally and acts locally so as to accommodate local variables in international markets, and is able to work across different cultures.
Human resources management is among the most important management functions because it focuses on the human element, the most valuable resource of any administration and the most influential in productivity. The management and development of human resources is considered a cornerstone in the majority of organizations; it aims to strengthen organizational capacity and to enable companies to attract and develop the competencies needed to keep up with current and future challenges. Human resources can contribute strongly to achieving an organization's objectives and profit, and beyond that to creating new jobs, alleviating unemployment, and achieving full employment. In short, human resources management means the optimal use of the human element, both available and expected: an organization's efficiency and its success in reaching its goals depend on the efficiency, capabilities, experience, and enthusiasm of this human element. Management scholars have therefore developed principles and foundations that help the organization draw the greatest benefit from each individual through human resources management. These foundations, beginning with planning and selection and continuing through training, incentives, and evaluation, are not separate from one another but integrate as a system, so that human resources management, and the organization as a whole, can function efficiently in the context of sustainable development.

Keywords: configuration, training, development, human resources, operating

Procedia PDF Downloads 432
959 Investigating the Influence of Roof Fairing on Aerodynamic Drag of a Bluff Body

Authors: Kushal Kumar Chode

Abstract:

With increasing demand for fuel savings and for faster vehicles with decent fuel economy, researchers around the world have investigated various passive flow control devices to improve vehicle fuel efficiency. In this paper, a roof fairing is investigated as a means of reducing the aerodynamic drag of a bluff body. The bluff body considered is the Ahmed model with a rake angle of 25°, subjected to a flow velocity of 40 m/s (Reynolds number 2.68 million) and analysed using the commercial Computational Fluid Dynamics (CFD) code Star-CCM+. The initial study made it evident that pressure drag is the main source of drag on an Ahmed body. Adding a roof fairing delayed flow separation and hence wake formation, improving the pressure in the near wake and shrinking the wake region. A roof fairing with height and length equal to 1/7H and 1/3L, respectively, yielded a drag reduction of 9%. An optimised fairing, obtained by increasing height, length, and width by 5%, recorded a drag reduction close to 12%.
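As a sanity check on the stated flow conditions: the quoted Reynolds number is consistent with the standard Ahmed-body length of about 1.044 m (an assumption here; the abstract does not state the reference length), and the drag-reduction percentages are simple ratios of drag coefficients (the Cd values below are hypothetical):

```python
def reynolds_number(velocity, length, nu=1.5e-5):
    # Re = V * L / nu; nu is an assumed kinematic viscosity of air (~20 C).
    return velocity * length / nu

def drag_reduction_percent(cd_baseline, cd_modified):
    # Percentage drag reduction relative to the baseline body.
    return 100.0 * (cd_baseline - cd_modified) / cd_baseline

# Standard Ahmed body length ~1.044 m at the stated 40 m/s freestream.
re = reynolds_number(40.0, 1.044)
```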

Keywords: Ahmed model, aerodynamic drag, passive flow control, roof fairing, wake formation

Procedia PDF Downloads 441
958 Characterization of Enterotoxigenic Escherichia coli CS6 Promoter

Authors: Mondal Indranil, Bhakat Debjyoti, Mukhopadayay Asish K., Chatterjee Nabendu S.

Abstract:

CS6 is the prevalent colonization factor (CF) in our region, and deciphering its molecular regulators would play a pivotal role in reducing the burden of ETEC pathogenesis. In prokaryotes, most genes are under the control of an operon, and the promoter upstream of a gene regulates its transcription. Here, the CS6 promoter was characterized computationally and further analyzed by β-galactosidase assay and sequencing. Promoter constructs and deletions were prepared as required to analyze promoter activity, and the effect of different additives on the CS6 promoter was measured by β-galactosidase assay. Bioinformatics analysis with Softberry/BPROM predicted fur, lrp, and crp boxes and the -10 and -35 regions upstream of the CS6 gene. A promoter construct in the promoterless plasmid pTL61T showed that the region -573 to +1 is indeed the promoter region, as predicted. Sequential deletion of the upstream region revealed that promoter activity remains unchanged when -573 bp to -350 bp is deleted, but after deletion of the region -350 bp to -255 bp, promoter expression drops drastically to 26%; further deletions decrease activity only slightly. The region -350 bp to -255 bp therefore holds key promoter sequence for the CS6 gene. Additives such as iron and NaCl modulate promoter activity in a dose-dependent manner. From this analysis, the minimal promoter lies between -254 and +1, while the region -350 bp to -255 bp contains important elements needed to control CS6 gene expression.

Keywords: microbiology, promoter, colonization factor, ETEC

Procedia PDF Downloads 162
957 Supervisor Controller-Based Colored Petri Nets for Deadlock Control and Machine Failures in Automated Manufacturing Systems

Authors: Husam Kaid, Abdulrahman Al-Ahmari, Zhiwu Li

Abstract:

This paper develops a robust deadlock control technique for shared and unreliable resources in automated manufacturing systems (AMSs) based on structural analysis and colored Petri nets, and consists of three steps. The first step uses strict minimal siphon control to create a live (deadlock-free) system that does not yet consider resource failure. The second step uses a colored Petri net-based approach in which all monitors designed in the first step are merged into a single monitor. The third step addresses the deadlock problems caused by resource failures: for all resource failures in the Petri net model, a common recovery subnet based on a colored Petri net is proposed and added to the system obtained in the second step to make it reliable. The proposed approach is evaluated using an AMS from the literature. The results show that the approach can be applied to an unreliable, complex Petri net model; it has a simpler structure and lower computational complexity, and a single common recovery subnet models all resource failures.
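The siphon-based supervisor itself is beyond a short sketch, but the deadlock notion the three steps guard against can be illustrated on a minimal ordinary (uncolored) place/transition net; the two-job, two-resource circular wait below is a textbook example, not the AMS from the paper:

```python
def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    # Fire a transition: consume input tokens, then produce output tokens.
    m = dict(marking)
    for p, w in pre.items():
        m[p] = m.get(p, 0) - w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

def deadlocked(marking, transitions):
    # Deadlock: no transition is enabled at the current marking.
    return not any(enabled(marking, pre) for pre, _ in transitions)

# Circular wait: each job waits for the resource the other would release.
transitions = [
    ({"wait1": 1, "r2": 1}, {"done1": 1, "r1": 1, "r2": 1}),
    ({"wait2": 1, "r1": 1}, {"done2": 1, "r1": 1, "r2": 1}),
]
stuck = {"wait1": 1, "wait2": 1, "r1": 0, "r2": 0}
```

A supervisor in the paper's sense would add monitor places that prevent the net from ever reaching a marking like `stuck`.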

Keywords: automated manufacturing system, colored Petri net, deadlocks, siphon

Procedia PDF Downloads 129
956 The Degree Project-Course in Swedish Teacher Education – Deliberative and Transformative Perspectives on the Formative Assessment Practice

Authors: Per Blomqvist

Abstract:

The overall aim of this study is to highlight how the degree project-course in teacher education has developed over time at Swedish universities, above all regarding changes in formative assessment practices in relation to students' opportunities to take part in writing processes that can develop their independent critical thinking, subject knowledge, and academic writing skills. Theoretically, the study rests on deliberative and transformative perspectives on teaching academic writing in higher education. The deliberative perspective is motivated by the fact that it is the universities and their departments' responsibility to give students opportunities to develop their academic writing skills, while there is little guidance on how this can be implemented. The transformative perspective is motivated by the fact that education needs to be adapted to students' prior knowledge and developed in relation to the student group; given the academisation of teacher education and the new student groups, this is a necessity. The empirical data consist of video recordings of teacher groups' conversations at three Swedish universities. The conversations were conducted as so-called collective remembering interviews, a method for stimulating the participants' memory through social interaction, and focused on how the degree project-course in teacher education has changed over time. Topic analysis was used to identify common descriptions and expressions among the teachers. The results highlight great similarities in how the degree project-course has changed over time, from both a deliberative and a transformative perspective. The course is characterized by a “strong framing,” in which the teachers control the work closely through detailed instructions for the writing process and detailed templates for the text.
This is justified by the fact that the education has been adapted to student teachers' lack of prior subject knowledge. The strong framing places high demands on continuous discussion among teachers about, for example, which tools the students bring with them and which linguistic and textual tools the education offers. The teachers describe that such governance often leads to conflicts between teachers from different departments, because reading and writing are always part of cultural contexts and are linked to different knowledge, traditions, and values. The problem made visible in this study raises questions about how students' opportunities to develop independence and make critical judgments in academic writing are affected if writing becomes too controlled and if passing students becomes the main goal of the education.

Keywords: formative assessment, academic writing, degree project, higher education, deliberative perspective, transformative perspective

Procedia PDF Downloads 65
955 Computational Determination of the Magneto Electronic Properties of Ce₁₋ₓCuₓO₂ (x=12.5%): Emerging Material for Spintronic Devices

Authors: Aicha Bouhlala, Sabah Chettibi

Abstract:

Doping CeO₂ with transition metals is an effective way of tuning its properties. In the present work, we have performed self-consistent ab-initio calculations using the full-potential linearized augmented plane-wave method (FP-LAPW), based on density functional theory (DFT) as implemented in the Wien2k simulation code, to study the structural, electronic, and magnetic properties of the fluorite-type oxide Ce₁₋ₓCuₓO₂ (x = 12.5%) and to explore the effects of the Cu dopant in ceria. The exchange-correlation potential was treated with the Perdew-Burke-Ernzerhof functional revised for solids (PBEsol). For the structural properties, the equilibrium lattice constant of the compound is 5.382 Å. For the electronic properties, the spin-polarized band structure elucidates the semiconducting nature of the material in both spin channels, with a narrow bandgap in the spin-down configuration (0.162 eV) and a wider bandgap in the spin-up configuration (2.067 eV). The doped Cu atom plays a vital role in increasing the magnetic moment of the supercell; the total magnetic moment is found to be 2.99438 μB, so Cu-doped CeO₂ shows strong ferromagnetic behavior. These predictions suggest the compound could be a good candidate for spintronics applications.

Keywords: Cu-doped CeO₂, DFT, Wien2k, properties

Procedia PDF Downloads 255
954 Thinking Differently about Diversity: A Literature Review

Authors: Natalie Rinfret, Francine Tougas, Ann Beaton

Abstract:

Conventions No. 100 and 111 of the International Labour Organization, passed in 1951 and 1958 respectively, established the principles of equal pay for men and women for work of equal value and freedom from discrimination in employment. Governments of different countries followed suit: in 1964, the Civil Rights Act was passed in the United States, and in 1972, Canada ratified Convention No. 100. Laws were thus enacted and programs implemented to combat discrimination in the workplace, and over time more than 90% of the member countries of the International Labour Organization have ratified these conventions, implementing programs, such as employment equity in Canada, aimed at groups recognized as facing discrimination in the labor market, including women. Although legislation has been in place for several decades, employment discrimination has not gone away. In this study, we pay particular attention to its hidden side: the emergence of subtle forms of discrimination that often fly under the radar but nevertheless have adverse effects on the attitudes and behaviors of members of targeted groups. Researchers have identified two forms of racial and gender bias. On the one hand, there are traditional prejudices, referring to beliefs about the inferiority and innate differences of women and racial minorities compared to White men. These have the effect of confining the two groups to job categories suited to their perceived limited abilities and can result in degrading, if not violent and hateful, language and actions. On the other hand, more subtle prejudices are better suited to current social norms. This subtlety, however, harbors a conflict between values of equality and remnants of negative beliefs and feelings toward women and racial minorities.
Our literature review also takes into account an overlooked group among those targeted by the programs in place, senior workers, and highlights the quantifiable and observable effects of prejudice and discriminatory behaviors in employment. The study proposes a hybrid model of interventions, taking into account the organizational system (employment equity practices), discriminatory attitudes and behaviors, and the type of leadership to be advocated. This hybrid model includes, first, the implementation of initiatives aimed at both promoting employment equity and combating discrimination and, second, the establishment of practices that foster inclusion: the full and complete participation of all, including seniors, in the mission of their organization.

Keywords: employment discrimination, gender bias, the hybrid model of interventions, senior workers

Procedia PDF Downloads 220
953 An Interpretative Historical Analysis of Asylum and Refugee Policies and Attitudes to Australian Immigration Laws

Authors: Kamal Kithsiri Karunadasa Hewawasam Revulge

Abstract:

This paper presents an interpretative historical analysis of Australian migration law, examining asylum and refugee policies and attitudes in Australia. It looks at major turning points in Australian migration history; in doing so, the researcher reviewed literature on the aspects crucial to understanding the current direction of Australian migration policy. Data were collected from secondary official government sources, including annual reports, media releases on immigration, inquiry reports, statistical information, and other available literature, to identify critical historical events that significantly shaped the development of asylum-seeker and refugee policies in Australia and to trace historical trends in official thinking. Reliance on these official sources is justified because they are the most convincing sources for analysing historical events in Australia, while additional literature provides critical analyses of the behaviour and culture of the Australian immigration administration. The analytical framework reviews key Australian government immigration policies from British colonization and the settlement era (1787-1850s) to the present. The rationale for doing so is that past events and incidents offer clues and lessons relevant to the present day: a perspective on Australian migration history helps to analyse how current policymakers' strategies developed and changed over time. Attention is also explicitly paid to Australian asylum and refugee policy in its international context, which broadens the analysis. The findings demonstrate a link between past events and the adverse current policies of the Australian government towards asylum seekers and refugees, highlighting that Australia's current migration policies are part of a carefully and deliberately planned pattern that arose from the occupation of Australia by early British settlers.
In this context, the remarkable point is that the historical removal of children from their Australian Indigenous parents, widely known as the 'stolen generations', reflected a model of assimilation: a desire to absorb other cultures into Australian society by having them fully adopt the settlers' language and culture while losing Indigenous traditions. Current Australian policies towards migrants reflect the same attitude. Hence, it can be argued that policies and attitudes towards asylum seekers and refugees, particularly the so-called 'boat people', to some extent still reflect Australia's earlier colonial and 'White Australia' history.

Keywords: migration law, refugee law, international law, administrative law

Procedia PDF Downloads 83
952 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM was developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. Owing to its strong classification ability, Bi-LSTM is considered here as an alternative to the maximum likelihood (ML) algorithm used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with that of LSTM network-aided DL-based OFDM-AIM (DeepAIM) and classic OFDM-AIM with ML-based signal detection, in terms of bit error rate (BER) and computational time. Simulation results show that Bi-DeepAIM achieves better BER performance than DeepAIM and lower signal-detection computation time than ML-AIM.
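A full Bi-LSTM detector needs a deep-learning framework and training data, but the bidirectional feature extraction at its core can be sketched in plain NumPy; the tanh cell below stands in for a gated LSTM cell, and all shapes and weights are illustrative, untrained assumptions:

```python
import numpy as np

def cell_step(h, x, W):
    # Simplified recurrent cell (a tanh RNN standing in for a gated LSTM cell).
    return np.tanh(W @ np.concatenate([h, x]))

def bidirectional_features(seq, W_f, W_b, hidden):
    # One pass over the received symbols in each direction, then concatenate
    # forward and backward hidden states per time step.
    h_f, h_b = np.zeros(hidden), np.zeros(hidden)
    fwd, bwd = [], []
    for x in seq:
        h_f = cell_step(h_f, x, W_f)
        fwd.append(h_f)
    for x in reversed(seq):
        h_b = cell_step(h_b, x, W_b)
        bwd.append(h_b)
    bwd.reverse()
    return [np.concatenate(fb) for fb in zip(fwd, bwd)]

rng = np.random.default_rng(0)
seq = [rng.standard_normal(2) for _ in range(5)]   # e.g. I/Q samples per subcarrier
W_f = rng.standard_normal((4, 6)) * 0.5            # hidden=4, input=2 -> 4 x (4+2)
W_b = rng.standard_normal((4, 6)) * 0.5
feats = bidirectional_features(seq, W_f, W_b, hidden=4)
```

A real Bi-DeepAIM detector would train gated cells end-to-end and map each concatenated feature through a softmax over the AIM subblock alphabet.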

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 72
951 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario

Authors: Adel Gurel, Ozge Ceylin Yildirim

Abstract:

Nowadays, design and architecture are being transformed by rapid advancements in technology, economics, politics, society, and culture. Architecture has been changing with the latest developments since computers entered design: integrating design into the computational environment has revolutionized architecture and opened new perspectives. The history of architecture shows the various technological developments through which architecture has transformed over time; analysing the integration of technology into the architectural process throughout history therefore makes it possible to build a consensus on how architecture should proceed. In this study, each period that arises from the integration of technology into architecture is addressed within its historical process. Changes brought to architecture by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments in technology and its use in architecture over the years are analysed comparatively in charts and graphs. The historical process of architecture and its transformation via technology are supported by a detailed literature review and consolidated by examining focal points of 20th-century architecture under the titles parametric design, genetic architecture, simulation, and biomimicry. The historical research between past and present leads to the conclusion that developments in architecture cannot keep up with advancements in technology; recent technological developments overshadow architecture, and technology even decides its direction. Finally, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.

Keywords: computer technologies, future architecture, scientific developments, transformation

Procedia PDF Downloads 191
950 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study

Authors: Ana Serafimovic, Karthik Devarajan

Abstract:

Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method minimizes the symmetric version of the Kullback-Leibler divergence, known as intrinsic information, and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It generalizes currently available algorithms for Gaussian, Poisson, gamma, and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods that also minimize symmetric divergence measures.
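The paper's symmetric-KL algorithm is not reproduced in the abstract; as a baseline, the classical Lee-Seung multiplicative updates for the one-sided generalized KL divergence, which the proposed method generalizes, look as follows (matrix sizes and iteration count are illustrative):

```python
import numpy as np

def nmf_gkl(V, rank, n_iter=500, seed=0, eps=1e-9):
    # Lee-Seung multiplicative updates minimizing the generalized (one-sided)
    # KL divergence D(V || WH).  The paper targets the *symmetric* KL
    # divergence; this baseline is the starting point it generalizes.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        H *= (W.T @ (V / (W @ H + eps))) / (W.T @ ones + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (ones @ H.T + eps)
    return W, H

rng = np.random.default_rng(1)
V = rng.random((6, 2)) @ rng.random((2, 5))   # exactly rank-2, non-negative
W, H = nmf_gkl(V, rank=2)
```

Because the updates are multiplicative, W and H stay non-negative throughout, which is what lets NMF learn additive, parts-based components.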

Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence

Procedia PDF Downloads 246
949 Deciding Graph Non-Hamiltonicity via a Closure Algorithm

Authors: E. R. Swart, S. J. Gismondi, N. R. Swart, C. E. Bell

Abstract:

We present a heuristic algorithm that decides graph non-Hamiltonicity. All graphs are directed, with each undirected edge regarded as a pair of counter-directed arcs. Each of the n! Hamilton cycles in a complete graph on n+1 vertices is mapped to an n-permutation matrix P, where p(u,i) = 1 if and only if the ith arc in a cycle enters vertex u, starting and ending at vertex n+1. We first create an exclusion set E by noting all arcs (u, v) not in G; this suffices to code precisely all cycles excluded from G, i.e., cycles not in G use at least one arc not in G. Members of E are pairs of components of P, {p(u,i), p(v,i+1)}, i = 1, ..., n-1. A doubly stochastic-like relaxed LP formulation of the Hamilton cycle decision problem is constructed, and each {p(u,i), p(v,i+1)} in E is coded as a variable q(u,i,v,i+1) = 0, shrinking the feasible region. We then implement the Weak Closure Algorithm (WCA), which tests necessary conditions of a matching, together with Boolean closure, to decide 0/1 variable assignments. Each {p(u,i), p(v,j)} not in E is tested for membership in E and, if possible, added to E (q(u,i,v,j) = 0) to iteratively maximize |E|. If the WCA constructs E to be maximal, i.e., the set of all {p(u,i), p(v,j)}, then G is decided non-Hamiltonian; only non-Hamiltonian G share this maximal property. Ten non-Hamiltonian graphs (10 through 104 vertices) and 2000 randomized 31-vertex non-Hamiltonian graphs were tested and correctly decided non-Hamiltonian. For Hamiltonian G, the complement of E covers a matching, which is perhaps useful in searching for cycles. We also present an example where the WCA fails.
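The mapping from cycles to permutation matrices and the initial construction of the exclusion set E can be made concrete in a few lines. The sketch below uses 0-based vertices, with vertex n playing the role of the paper's vertex n+1, and the brute-force checker is only a reference for small graphs, not the WCA:

```python
from itertools import permutations

def hamiltonian_brute(n, arcs):
    # Reference check: does the digraph on vertices 0..n contain a Hamilton
    # cycle that starts and ends at the distinguished vertex n?
    for perm in permutations(range(n)):
        cycle = (n,) + perm + (n,)
        if all((cycle[i], cycle[i + 1]) in arcs for i in range(n + 1)):
            return True
    return False

def exclusion_pairs(n, arcs):
    # Initial exclusion set E: for each arc (u, v) absent from G between the
    # internal vertices, forbid p(u, i) = p(v, i+1) = 1 jointly, i = 1..n-1.
    return {((u, i), (v, i + 1))
            for u in range(n) for v in range(n)
            if u != v and (u, v) not in arcs
            for i in range(1, n)}

complete = {(u, v) for u in range(4) for v in range(4) if u != v}
```

The WCA then tries to grow this initial E toward the maximal set by testing each remaining pair {p(u,i), p(v,j)} against matching and Boolean-closure conditions.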

Keywords: Hamilton cycle decision problem, computational complexity theory, graph theory, theoretical computer science

Procedia PDF Downloads 373
948 Constructing a Physics Guided Machine Learning Neural Network to Predict Tonal Noise Emitted by a Propeller

Authors: Arthur D. Wiedemann, Christopher Fuller, Kyle A. Pascioni

Abstract:

With the introduction of electric motors, designers of small unmanned aerial vehicles have to consider trade-offs between acoustic noise and generated thrust. Currently, few computationally cheap tools are available for predicting the acoustic noise a propeller emits into the far field. Artificial neural networks offer a highly non-linear and adaptive model for predicting isolated and interactive tonal noise, but they require large data sets, exceeding what is practical when modeling experimental results. A methodology known as physics-guided machine learning is applied in this study to reduce the data set required to train the network. After building and evaluating several neural networks, the best model is investigated to determine how the network successfully predicts the acoustic waveform. Lastly, a post-network transfer function is developed to remove discontinuities from the predicted waveform. Overall, physics-guided machine learning shows a notable improvement in prediction performance, but additional loss functions are necessary for constructing predictive networks on small datasets.
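The abstract does not spell out the physics-guided loss. One plausible form, shown here as an assumption rather than the authors' formulation, augments the data misfit with a penalty for violating periodicity at the blade-passage frequency:

```python
import numpy as np

def physics_guided_loss(y_pred, y_true, t, bpf, lam=0.5):
    # Data term: ordinary mean-squared error against the measured waveform.
    data_loss = np.mean((y_pred - y_true) ** 2)
    # Physics term: tonal rotor noise is periodic at the blade-passage
    # frequency (bpf), so penalize deviation from bpf-periodicity.
    period = 1.0 / bpf
    shifted = np.interp(t + period, t, y_pred)   # prediction one period later
    valid = t + period <= t[-1]                  # ignore the extrapolated tail
    physics_loss = np.mean((y_pred[valid] - shifted[valid]) ** 2)
    return data_loss + lam * physics_loss

t = np.linspace(0.0, 1.0, 1000)
bpf = 5.0                                        # hypothetical blade-passage frequency, Hz
tone = np.sin(2 * np.pi * bpf * t)
```

During training, the physics term supplies gradient information even where measurements are sparse, which is how physics-guided learning reduces the required data set.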

Keywords: aeroacoustics, machine learning, propeller, rotor, neural network, physics guided machine learning

Procedia PDF Downloads 228
947 Amplifying Sine Unit-Convolutional Neural Network: An Efficient Deep Architecture for Image Classification and Feature Visualizations

Authors: Jamshaid Ul Rahman, Faiza Makhdoom, Dianchen Lu

Abstract:

Activation functions play a decisive role in determining the capacity of deep neural networks (DNNs), as they enable the network to capture the nonlinearities inherent in the data fed to it. Prior research on activation functions focused primarily on monotonic or non-oscillatory functions, until the Growing Cosine Unit (GCU) broke that taboo for a number of applications. In this paper, a Convolutional Neural Network (CNN) model named ASU-CNN is proposed, which uses the recently designed Amplifying Sine Unit (ASU) activation function across its layers. The effect of this non-monotonic, oscillatory function is inspected through feature-map visualizations from different convolutional layers. The proposed network is optimized with Adam under a fine-tuned learning rate. The network achieved promising results on both training and testing data for the classification of CIFAR-10. The experimental results affirm the computational feasibility and efficacy of the proposed model for computer vision tasks.
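The abstract does not restate the ASU formula. A natural form for such an oscillatory unit, assumed here by analogy with the Growing Cosine Unit GCU(x) = x·cos(x), is ASU(x) = x·sin(x); the paper should be consulted for the exact definition:

```python
import numpy as np

def asu(x):
    # Assumed form: ASU(x) = x * sin(x), by analogy with the Growing Cosine
    # Unit GCU(x) = x * cos(x).  See the paper for the exact definition.
    return x * np.sin(x)

def asu_grad(x):
    # d/dx [x sin x] = sin x + x cos x -- oscillatory and non-monotonic,
    # which is what lets a single unit carve multiple decision regions.
    return np.sin(x) + x * np.cos(x)
```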

Keywords: amplifying sine unit, activation function, convolutional neural networks, oscillatory activation, image classification, CIFAR-10

Procedia PDF Downloads 111
946 Understanding Strategic Engagement on the Conversation Table: Countering Terrorism in Nigeria

Authors: Anisah Ari

Abstract:

The effects of organized crime permeate all facets of life, including public health, socio-economic endeavors, and human security; when any of these is affected, large-scale national and global interests are impacted. Seeking to address terrorist networks through technical thinking is like trying to kill a weed by cutting off its branches: it re-develops and expands in proportions beyond one's imagination, even in horrific ways that threaten human security. The continent of Africa has been bedeviled by this menace, with little or no solution to the problem. Nigeria is dealing with a protracted insurgency perpetrated by a sect opposed to any form of westernization. Reimagining approaches to pressing issues like terrorism may require engaging the right people in the conversation for any sustainable change: the people who have lived through the daily violence that ensues from terrorist activities. Effective leadership is required for an inclusive process in which spaces are created for diverse voices, multiple perspectives are listened to and not just heard, and realistic outcomes can be determined. Efforts to address the insurgency in Nigeria have been marked by disinformation and uncertainty, due in part to poor leadership and to the repeated application of technical solutions to what is an adaptive challenge. Peacemaking efforts in Nigeria have focused on the behaviors, attitudes, and practices that contribute to violence; however, it is important to consider the underlying issues that build up, ignite, and fan the flames of violence, viewing the conflict as a complex system shaped by climate change, low employment rates, corruption, and discrimination on the grounds of ethnicity and religion carried out with impunity. This article considers the more relational option of addressing the insurgency through adaptive approaches that embody engagement and solutions with the people rather than for the people.
The construction of a local turn in peacebuilding is informed by the need to create a locally driven and sustained peace process that embodies the culture and practices of the people, enacting an everyday peace beyond a perennial and universalist outlook. A critical analysis of socially identified individuals and situations will be made, favoring a more adaptive approach to a complex existential challenge over a universalist frame. Case study and ethnographic research approaches are used to understand what other scholars have documented on the matter and to gain a first-hand understanding of the experiences and viewpoints of the participants.

Keywords: terrorism, adaptive, peace, culture

Procedia PDF Downloads 103
945 The Effects of Computer Game-Based Pedagogy on Graduate Students Statistics Performance

Authors: Clement Yeboah, Eva Laryea

Abstract:

A pretest-posttest within-subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the Southeastern United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The t-test for statistical knowledge was statistically significant, indicating significant mean gains as a function of the game-based intervention. Likewise, the t-test for statistics-related anxiety was statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help create a more dynamic and engaging classroom environment and improve student learning outcomes. For students, playing these educational games can help develop important skills such as problem solving, critical thinking, and collaboration. Students can develop an interest in the subject matter and spend quality time learning the course material as they play, without noticing that they are learning a course they had presupposed to be hard. The future directions of the present study are promising as technology continues to advance and become more widely available. Potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools.
It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way basic statistics graduate students learn and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers and will continue to be a dynamic and rapidly evolving field for years to come.
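The paired-samples analysis described in the abstract above can be sketched in a few lines. The scores below are simulated (the sample size of 34 matches the abstract, but the score distributions and mean gain are hypothetical, chosen only to illustrate the computation of the paired t statistic):

```python
import math
import random

# Simulated pretest/posttest achievement scores for N = 34 participants.
# Distributions are hypothetical, not the study's data.
rng = random.Random(42)
pretest = [rng.gauss(60, 10) for _ in range(34)]
posttest = [x + rng.gauss(5, 4) for x in pretest]  # simulated mean gain

# Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
# where d is the vector of within-subject differences.
d = [post - pre for pre, post in zip(pretest, posttest)]
n = len(d)
mean_d = sum(d) / n
sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
t_stat = mean_d / (sd_d / math.sqrt(n))
```

The resulting `t_stat` would then be compared against the t distribution with n - 1 degrees of freedom to obtain a p-value.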

Keywords: pretest-posttest within subjects, computer game-based learning, statistics achievement, statistics anxiety

Procedia PDF Downloads 77
944 Contact-Impact Analysis of Continuum Compliant Athletic Systems

Authors: Theddeus Tochukwu Akano, Omotayo Abayomi Fakinlede

Abstract:

Proper understanding of the behavior of the compliant mechanisms used by athletes is important in order to avoid catastrophic failure. Compliant mechanisms like the flex-run require knowledge of their dynamic response and deformation behavior under rapidly varying loads. The finite deformations of the compliant athletic system are described by a Neo-Hookean model under contact-impact conditions. The dynamic contact-impact governing equations for both the target and the impactor are derived based on the updated Lagrangian approach. A method in which contactor and target are considered as a united body is applied in formulating the principle of virtual work for the bodies. In this paper, methods of continuum mechanics and the nonlinear finite element method are deployed to develop a model that captures the behavior of the compliant athletic system under rapidly varying loads. A hybrid system of symbolic algebra (AceGen) and a compiled back end (AceFEM) is employed, leveraging both ease of use and computational efficiency. The simulated results reveal the effect of the various contact-impact conditions on the deformation behavior of the impacting compliant mechanism.
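As an illustration of the constitutive law named above, the following sketch evaluates the strain-energy density of a generic incompressible Neo-Hookean solid under uniaxial stretch. The shear modulus value and the uniaxial loading case are illustrative assumptions, not parameters from the paper:

```python
def neo_hookean_uniaxial(stretch, mu=1.0):
    """Strain-energy density W = (mu/2) * (I1 - 3) of an incompressible
    Neo-Hookean solid under uniaxial stretch, where the first invariant
    is I1 = stretch**2 + 2/stretch (volume is preserved, so the two
    lateral stretches are each 1/sqrt(stretch))."""
    i1 = stretch ** 2 + 2.0 / stretch
    return 0.5 * mu * (i1 - 3.0)
```

The energy vanishes in the undeformed state (`stretch = 1`) and grows monotonically with stretch, which is the behavior a finite element implementation of the model would reproduce at each quadrature point.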

Keywords: eigenvalue problems, finite element method, robin boundary condition, sturm-liouville problem

Procedia PDF Downloads 472
943 Real-Time Path Planning for Unmanned Air Vehicles Using Improved Rapidly-Exploring Random Tree and Iterative Trajectory Optimization

Authors: A. Ramalho, L. Romeiro, R. Ventura, A. Suleman

Abstract:

A real-time path planning framework for Unmanned Air Vehicles (UAVs), and in particular multi-rotors, is proposed. The framework is designed to provide feasible trajectories from the current UAV position to a goal state, taking into account constraints such as obstacle avoidance, problem kinematics, and vehicle limitations such as maximum speed and maximum acceleration. The framework computes feasible paths online, allowing the vehicle to avoid new, unknown, dynamic obstacles without fully re-computing the trajectory. These features are achieved using an iterative process in which the robot computes and optimizes the trajectory while performing the mission objectives. A first trajectory is computed using a modified Rapidly-Exploring Random Tree (RRT) algorithm that provides trajectories respecting a maximum curvature constraint. The trajectory optimization is accomplished using the Interior Point Optimizer (IPOPT) as a solver. The framework has proven able to compute a trajectory and optimize it to a local optimum with a computational efficiency that makes it feasible for real-time operations.
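A minimal sketch of the basic RRT idea underlying the first stage is shown below: a plain 2D RRT with goal biasing. The workspace bounds, step size, goal bias, and the absence of the paper's curvature constraint and obstacle model are all simplifying assumptions:

```python
import math
import random

def rrt(start, goal, is_free, bounds=(0.0, 10.0), step=0.5,
        goal_tol=0.8, max_iter=5000, seed=0):
    """Grow a tree from `start` by repeatedly steering the nearest node
    toward a random sample; every 10th sample is the goal (goal bias)."""
    rng = random.Random(seed)
    lo, hi = bounds
    nodes = [start]
    parent = {0: None}
    for it in range(max_iter):
        sample = goal if it % 10 == 0 else (rng.uniform(lo, hi),
                                            rng.uniform(lo, hi))
        # Find the tree node nearest to the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        d = math.dist(nodes[i], sample)
        if d == 0.0:
            continue
        # Steer one fixed step from the nearest node toward the sample.
        new = (nodes[i][0] + step * (sample[0] - nodes[i][0]) / d,
               nodes[i][1] + step * (sample[1] - nodes[i][1]) / d)
        if not is_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            path, k = [], len(nodes) - 1
            while k is not None:        # walk parents back to the root
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

# Obstacle-free workspace for illustration.
path = rrt((1.0, 1.0), (9.0, 9.0), lambda p: True)
```

A real planner would replace `is_free` with collision checking against the obstacle map and post-process the tree branch (as the paper does, via curvature-aware extension and IPOPT refinement).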

Keywords: interior point optimization, multi-rotors, online path planning, rapidly exploring random trees, trajectory optimization

Procedia PDF Downloads 135
942 Evaluating Forecasting Strategies for Day-Ahead Electricity Prices: Insights From the Russia-Ukraine Crisis

Authors: Alexandra Papagianni, George Filis, Panagiotis Papadopoulos

Abstract:

The liberalization of the energy market and the increasing penetration of fluctuating renewables (e.g., wind and solar power) have heightened the importance of the spot market for ensuring efficient electricity supply. This is further emphasized by the EU’s goal of achieving net-zero emissions by 2050. The day-ahead market (DAM) plays a key role in European energy trading, accounting for 80-90% of spot transactions and providing critical insights for next-day pricing. Therefore, short-term electricity price forecasting (EPF) within the DAM is crucial for market participants to make informed decisions and improve their market positioning. Existing literature highlights out-of-sample performance as a key factor in assessing EPF accuracy, with influencing factors such as predictors, forecast horizon, model selection, and strategy. Several studies indicate that electricity demand is a primary price determinant, while renewable energy sources (RES) like wind and solar significantly impact price dynamics, often lowering prices. Additionally, incorporating data from neighboring countries, due to market coupling, further improves forecast accuracy. Most studies predict up to 24 steps ahead using hourly data, while some extend forecasts using higher-frequency data (e.g., half-hourly or quarter-hourly). Short-term EPF methods fall into two main categories: statistical and computational intelligence (CI) methods, with hybrid models combining both. While many studies use advanced statistical methods, particularly through different versions of traditional AR-type models, others apply computational techniques such as artificial neural networks (ANNs) and support vector machines (SVMs). Recent research combines multiple methods to enhance forecasting performance. Despite extensive research on EPF accuracy, a gap remains in understanding how forecasting strategy affects prediction outcomes. While iterated strategies are commonly used, they are often chosen without justification. 
This paper contributes by examining whether the choice of forecasting strategy impacts the quality of day-ahead price predictions, especially for multi-step forecasts. We evaluate both iterated and direct methods, exploring alternative ways of conducting iterated forecasts on benchmark and state-of-the-art forecasting frameworks. The goal is to assess whether these factors should be considered by end-users to improve forecast quality. We focus on the Greek DAM using data from July 1, 2021, to March 31, 2022. This period is chosen due to significant price volatility in Greece, driven by its dependence on natural gas and limited interconnection capacity with larger European grids. The analysis covers two phases: pre-conflict (January 1, 2022, to February 23, 2022) and post-conflict (February 24, 2022, to March 31, 2022), following the outbreak of the Russia-Ukraine conflict, which triggered an energy crisis. We use the mean absolute percentage error (MAPE) and symmetric mean absolute percentage error (sMAPE) for evaluation, as well as the Direction of Change (DoC) measure to assess the accuracy of price movement predictions. Our findings suggest that forecasters should evaluate all strategies across different horizons and models: different strategies may be required for different horizons to optimize both accuracy and directional predictions, ensuring more reliable forecasts.
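The evaluation metrics named above can be sketched as plain functions. The definitions follow common textbook forms; the direction-of-change variant shown, comparing the signs of successive actual and forecast changes, is one of several conventions and is not necessarily the paper's exact formulation:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

def smape(actual, forecast):
    """Symmetric MAPE, in percent (bounded above by 200)."""
    return 100.0 * sum(2.0 * abs(f - a) / (abs(a) + abs(f))
                       for a, f in zip(actual, forecast)) / len(actual)

def direction_of_change(actual, forecast):
    """Fraction of steps where the forecast change has the same sign
    as the realized change in the actual series."""
    hits = sum((a1 - a0) * (f1 - f0) > 0
               for a0, a1, f0, f1 in zip(actual, actual[1:],
                                         forecast, forecast[1:]))
    return hits / (len(actual) - 1)
```

MAPE penalizes over- and under-forecasts asymmetrically when prices are volatile, which is why sMAPE and DoC are reported alongside it.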

Keywords: short-term electricity price forecast, forecast strategies, forecast horizons, recursive strategy, direct strategy

Procedia PDF Downloads 7
941 The Principle of a Thought Formation: The Biological Base for a Thought

Authors: Ludmila Vucolova

Abstract:

Thought is a process that underlies consciousness and cognition, and understanding its origin and processes is a longstanding goal of many academic disciplines. By integrating over twenty novel ideas and hypotheses of this theoretical proposal, we can speculate that thought is an emergent property of coded neural events, translating the electro-chemical interactions of the body with its environment—the objects of sensory stimulation, X and Y. The latter is a self-generated feedback entity resulting from the arbitrary pattern of motion of a body's motor repertory (M). A culmination of these neural events gives rise to a thought: a state of identity between an observed object X and a symbol Y. It manifests as a 'state of awareness' or 'state of knowing' and forms our perception of the physical world. The values of the variables of a construct—X (object), S1 (sense for the perception of X), Y (symbol), S2 (sense for the perception of Y), and M (motor repertory that produces Y)—specify the particular conscious percept at any given time. The proposed principle of interaction between the elements of a construct (X, Y, S1, S2, M) is universal and applies to all modes of communication (normal, deaf, blind, deaf-blind) and to various language systems (Chinese, Italian, English, etc.). The particular arrangement of modalities of each of the three modules S1 (5 of 5), S2 (1 of 3), and M (3 of 3) defines a specific mode of communication. This multifaceted paradigm demonstrates a predetermined pattern of relationships between X, Y, and M that passes from generation to generation. The presented analysis of a cognitive experience encompasses the key elements of embodied cognition theories and accords with the scientific interpretation of cognition as the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses; that is, cognition involves thinking and awareness.
By assembling the novel ideas presented in twelve sections, we can reveal that in the invisible 'chaos' there is an order, a structure with landmarks and principles of operation, and that mental processes (thoughts) are physical and have a biological basis. This proposal explains the phenomenon of mental imagery, gives a first insight into the relationship between mental states and brain states, and supports the notion that mind and body are inseparably connected. The findings of this theoretical proposal are supported by current scientific data and are substantiated by the records of the evolution of language and human intelligence.

Keywords: agent, awareness, cognitive, element, experience, feedback, first person, imagery, language, mental, motor, object, sensory, symbol, thought

Procedia PDF Downloads 384
940 Crow Search Algorithm-Based Task Offloading Strategies for Fog Computing Architectures

Authors: Aniket Ganvir, Ritarani Sahu, Suchismita Chinara

Abstract:

The rapid digitization of various aspects of life is leading to the creation of smart IoT ecosystems, where interconnected devices generate significant amounts of valuable data. However, these IoT devices face constraints such as limited computational resources and bandwidth. Cloud computing offers ample resources for offloading tasks, but it introduces latency issues, especially for time-sensitive applications. Fog computing (FC) addresses these latency concerns by bringing computation and storage closer to the network edge, minimizing data travel distance and enhancing efficiency. Offloading tasks to fog nodes or the cloud can conserve energy and extend IoT device lifespan. The offloading process is intricate, with tasks categorized as full or partial, and its optimization presents an NP-hard problem. Traditional greedy search methods struggle to address the complexity of task offloading efficiently. To overcome this, the efficient crow search algorithm (ECSA) is proposed as a meta-heuristic optimization algorithm. ECSA aims to effectively optimize computation offloading, providing solutions to this challenging problem.
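For orientation, here is a minimal sketch of the plain crow search metaheuristic that ECSA builds on, minimizing a generic objective. The flight length `fl`, awareness probability `ap`, bounds, and the toy objective are illustrative assumptions, not the paper's offloading formulation:

```python
import random

def crow_search(fitness, dim=2, n_crows=20, iters=200, fl=2.0, ap=0.1,
                bounds=(-5.0, 5.0), seed=1):
    """Plain crow search: each crow remembers its best hiding place;
    crow i follows a random crow j toward j's memory, unless j notices
    (probability `ap`), in which case i repositions at random."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_crows)]
    mem = [p[:] for p in pos]              # best position found by each crow
    mem_fit = [fitness(p) for p in mem]
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.randrange(n_crows)     # crow i picks a random crow j
            if rng.random() >= ap:         # j unaware: move toward j's memory
                new = [pos[i][k] + fl * rng.random() * (mem[j][k] - pos[i][k])
                       for k in range(dim)]
            else:                          # j aware: random reposition
                new = [rng.uniform(lo, hi) for _ in range(dim)]
            new = [min(hi, max(lo, x)) for x in new]   # clamp to bounds
            pos[i] = new
            f = fitness(new)
            if f < mem_fit[i]:             # update crow i's memory
                mem[i], mem_fit[i] = new[:], f
    best = min(range(n_crows), key=lambda k: mem_fit[k])
    return mem[best], mem_fit[best]

# Toy usage: minimize the sphere function.
best, best_fit = crow_search(lambda p: sum(x * x for x in p))
```

In a task offloading setting, the position vector would encode an assignment of tasks to IoT device, fog node, or cloud, and `fitness` would score latency and energy.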

Keywords: IoT, fog computing, task offloading, efficient crow search algorithm

Procedia PDF Downloads 58
939 Hydrodynamic Simulation of Co-Current and Counter Current of Column Distillation Using Euler Lagrange Approach

Authors: H. Troudi, M. Ghiss, Z. Tourki, M. Ellejmi

Abstract:

Packed columns for liquefied petroleum gas (LPG) separate a liquid mixture of propane and butane into pure components by distillation. The gas and liquid flow inside the columns is operated in two ways: co-current and counter-current operation. Heat, mass, and species transfer between phases are the most important factors influencing the choice between these two operations. In this paper, both processes are discussed using CFD simulation with the ANSYS Fluent software. Only a 3D half-section of the packed column was considered, with one packed bed, characterized in our case as a porous medium. The simulations were carried out under transient conditions. A multi-component gas and liquid mixture was used in the two processes. We utilized the Euler-Lagrange approach, in which the gas is treated as a continuum phase and the liquid as a group of dispersed particles. The heat and mass transfer process was modeled using a multi-component droplet evaporation approach. The results show that the counter-current process performs better than the co-current one, although limitations of our approach are noted. This comparison gives accurate results for computation times higher than 2 s, at different gas velocities and at a packed bed porosity of 0.9.

Keywords: co-current, counter-current, Euler-Lagrange model, heat transfer, mass transfer

Procedia PDF Downloads 212
938 Effects of the Quality Construction of Public Construction in Taiwan to Implementation Three Levels Quality Management Institution

Authors: Hsin-Hung Lai, Wei Lo

Abstract:

The construction quality of public construction projects, whether good or poor, is one of the important indicators of national economic development and overall construction capability, and its impact on the quality of national life is profound. In recent years, a number of public construction scandals have occurred, and both government agencies and the public now hold the quality of public construction projects to stricter requirements than ever; the government's three-level quality control system for public construction quality has had a profound impact. This study traces the evolution of the ISO 9000 quality control system and the differences between the construction quality management practices of many countries and our country's three-level quality control. We found that in foreign countries, projects to enhance construction quality are almost always led by civil organizations, whereas in our country they are driven by national power: our three-level quality control system and audit mechanism were developed on the basis of the ISO system and implemented through legislation. We also explored how such a system, backed by national power, enhances the construction quality of public construction projects, and the audited results indeed show that construction quality has been enhanced.
The three-level quality control system our country uses for public construction policy is broadly similar to the quality control systems of many developed countries; however, our country applies it to public construction projects only. We promote the three-level quality control system to enhance the quality of public construction projects and to establish an effective quality management system, so as to urge, correct, and prevent quality management defects by contractors, whereas developed countries promote such systems comprehensively, covering both public and civil construction. This study is therefore limited in scope to public construction projects. Most important is the quality awareness of the executor: good quality or its deterioration is not a single event, but the outcome of a procedure extending from demand and feasibility analysis, design, tendering, contracting, construction performance, inspection, continuous improvement, completion and acceptance, and transfer through to meeting the needs of the users. All of these stages have causal relationships, making quality a systemic problem, so the best construction quality can be produced and managed at reasonable cost through comprehensive, preventive thinking. Aggregating the results of the past 10 years (2005 to 2015), the audited results of both central and local units increased slightly in A-grade while those in B-grade decreased. Although the levels were not dramatically upgraded, this result shows that contractors' conception of construction quality is improving and that construction quality is now established at the design stage, which is beneficial to the enhancement of the construction quality of public construction projects overall.

Keywords: ISO 9000, three-level quality control system, audit and review mechanism for construction implementation, quality of construction implementation

Procedia PDF Downloads 346
937 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams

Authors: Mette Pedersen

Abstract:

This paper outlines the criteria that assessment specialists use when they design the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio) and a prompt, which is a statement of the topic phrased as an interrogative sentence. Due to the richness of its source materials and the amount of time test takers are given to prepare and write their responses (55 minutes in total), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes to the greatest lengths to unleash test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to be present among the sources, and the thinking behind the articulation of prompts. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The conclusion of the study is that test taker performance improves significantly when the sources that test takers are presented with express directly opposing viewpoints. Test taker performance also improves when the interrogative prompt is phrased as a yes/no question.
Finally, an analysis of the linguistic difficulty and complexity levels of the printed sources reveals that test taker performance does not decrease when the complexity level of the article in the 'Persuasive Essay' increases. This text complexity analysis is performed with the help of the 'ETS TextEvaluator' tool and the 'Complexity Scale for Information Texts (Scale)', two tools that, in combination, provide a rubric and a fully automated technology for evaluating nonfiction and informational texts in English translation.

Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing

Procedia PDF Downloads 144
936 Evaluating the Effect of Structural Reorientation to Thermochemical and Energetic Properties of 1,4-Diamino-3,6-Dinitropyrazolo[4,3- C]Pyrazole

Authors: Lamla Thungatha, Conrad Mahlase, Lisa Ngcebesha

Abstract:

1,4-Diamino-3,6-dinitropyrazolo[4,3-c]pyrazole (LLM-119) and its structural isomer 3,6-dinitropyrazolo[3,4-c]pyrazole-1,4(6H)-diamine were designed by structural reorientation of the fused pyrazole rings and their respective substituents (-NO2 and -NH2). Structural reorientation involves structural rearrangement that results in different structural isomers; employing this approach, six structural isomers of LLM-119 were obtained. The effect of structural reorientation (isomerization and derivatives) on the enthalpy of formation, detonation properties, impact sensitivity, and density of these molecules is studied computationally. The computational methods used are detailed in the document, and they yielded results close to literature values, with relative errors of 2% for enthalpy of formation, 2% for density, 0.05% for detonation velocity, and 4% for detonation pressure. Correlating the structural reorientation with the calculated thermochemical and detonation properties indicated that molecules with a -NO2 group attached to a carbon atom and an -NH2 group connected to a nitrogen atom maximize the enthalpy of formation and detonation velocity; the joining of the pyrazole rings has less effect on these parameters. Density and detonation pressure improved when both -NO2 and -NH2 functional groups were on the same side of the molecular structure. The structural reorientation gave rise to 3,4-dinitropyrazolo[3,4-c]pyrazole-1,6-diamine, which exhibited optimal density and detonation performance compared to the other molecules.

Keywords: LLM-119, fused rings, azole, structural isomers, detonation properties

Procedia PDF Downloads 92
935 An Insight into the Conformational Dynamics of Glycan through Molecular Dynamics Simulation

Authors: K. Veluraja

Abstract:

The glycans of glycolipids and glycoproteins play a significant role in living systems, particularly in molecular recognition processes. These processes are attributed to the glycans' occurrence on the surface of the cell, the sequential arrangement and type of sugar molecules present in the oligosaccharide structure, glycosidic linkage diversity (glycoinformatics), and conformational diversity (glycoconformatics). Molecular dynamics (MD) simulation is a theoretical and computational tool successfully utilized to establish the glycoconformatics of glycans. Studies on various glycan oligosaccharides clearly indicate that oligosaccharides exist in multiple conformational states, which arise from the flexibility associated with the glycosidic torsional angles (φ, ψ). As an example, the single disaccharide structure NeuNAcα(2-3)Gal exists in three different conformational states due to differences in the preferred values of the glycosidic torsional angles (φ, ψ). Hence, establishing three-dimensional structural and conformational models for glycans (Cartesian coordinates of every atom of an oligosaccharide structure in a preferred conformation) is crucial to understanding molecular recognition processes such as glycan-toxin and glycan-virus interactions. The glycoconformatics models obtained for various glycans through MD simulation are stored in 3DSDSCAR (3DSDSCAR.ORG), a public-domain database; its utility in understanding molecular recognition processes and in drug design ventures will be discussed.

Keywords: glycan, glycoconformatics, molecular dynamics simulation, oligosaccharide

Procedia PDF Downloads 137
934 An Entropy Stable Three Dimensional Ideal MHD Solver with Guaranteed Positive Pressure

Authors: Andrew R. Winters, Gregor J. Gassner

Abstract:

A high-order numerical magnetohydrodynamics (MHD) solver built upon a non-linear entropy stable numerical flux function that supports eight traveling wave solutions is described. The method is designed to treat the divergence-free constraint on the magnetic field in a similar fashion to a hyperbolic divergence cleaning technique. The solver is especially well-suited for flows involving strong discontinuities due to its strong stability, without the need to enforce artificial low density or energy limits. Furthermore, a new formulation of the numerical algorithm that guarantees positivity of the pressure during the simulation is described and presented. By construction, the solver conserves mass, momentum, and energy and is entropy stable. High spatial order is obtained through the use of a third-order limiting technique. High temporal order is achieved by utilizing the family of strong stability preserving (SSP) Runge-Kutta methods. The main attributes of the solver are presented, as well as details of its implementation into the multi-physics, multi-scale simulation code FLASH. The accuracy, robustness, and computational efficiency are demonstrated with a variety of numerical tests. Comparisons are also made between the new solver and existing methods already present in the FLASH framework.
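The SSP time integrators mentioned above can be illustrated with the classic third-order scheme of Shu and Osher, written as convex combinations of forward-Euler stages. The scalar test problem below is an illustration only, not one of the paper's MHD tests:

```python
import math

def ssp_rk3_step(u, dt, L):
    """One step of the third-order SSP Runge-Kutta scheme (Shu-Osher).
    Each stage is a convex combination of forward-Euler updates, which
    is what lets the scheme inherit the strong stability of forward
    Euler under the same time-step restriction."""
    u1 = u + dt * L(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * L(u1))
    return u / 3.0 + (2.0 / 3.0) * (u2 + dt * L(u2))

# Integrate the scalar test problem du/dt = -u from u(0) = 1 to t = 1.
u, dt = 1.0, 0.01
for _ in range(100):
    u = ssp_rk3_step(u, dt, lambda x: -x)
# u now approximates exp(-1) to third-order accuracy.
```

In an MHD code, `u` would be the vector of conserved variables on the mesh and `L` the entropy stable spatial operator.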

Keywords: entropy stability, finite volume scheme, magnetohydrodynamics, pressure positivity

Procedia PDF Downloads 343
933 Continuous Differential Evolution Based Parameter Estimation Framework for Signal Models

Authors: Ammara Mehmood, Aneela Zameer, Muhammad Asif Zahoor Raja, Muhammad Faisal Fateh

Abstract:

In this work, the strength of a bio-inspired computational intelligence technique is exploited for parameter estimation of periodic signals using Continuous Differential Evolution (CDE), with an error function defined in the mean-square sense. The multidimensional and nonlinear nature of the problem emerging in sinusoidal signal models, along with noise, makes it a challenging optimization task, which is dealt with by the robustness and effectiveness of CDE to ensure convergence and avoid trapping in local minima. In the proposed scheme of Continuous Differential Evolution based Signal Parameter Estimation (CDESPE), the unknown adjustable weights of the signal system identification model are optimized utilizing the CDE algorithm. The performance of the CDESPE model is validated through various statistics-based performance indices over a sufficiently large number of runs, in terms of estimation error, mean squared error, and Theil's inequality coefficient. The efficacy of CDESPE is examined by comparison with the actual parameters of the system, with Genetic Algorithm based outcomes, and with various deterministic approaches at different signal-to-noise ratio (SNR) levels.
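A minimal sketch of the classic DE/rand/1/bin scheme that differential-evolution-based estimators build on is given below. The population size, `F`, `CR`, and the quadratic test objective are illustrative assumptions; the paper's continuous variant and its sinusoidal signal model are not reproduced here:

```python
import random

def differential_evolution(obj, bounds, pop_size=30, F=0.8, CR=0.9,
                           iters=300, seed=0):
    """DE/rand/1/bin: mutate with a scaled difference of two random
    members added to a third, binomially cross over with the parent,
    and keep the trial vector if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [obj(p) for p in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([k for k in range(pop_size) if k != i], 3)
            j_rand = rng.randrange(dim)   # ensure at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(hi, max(lo, v)))   # clamp to bounds
            f = obj(trial)
            if f <= fit[i]:               # greedy selection
                pop[i], fit[i] = trial, f
    best = min(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

# Toy usage: minimize a convex quadratic with optimum at (1, -2).
best, best_fit = differential_evolution(
    lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
    [(-5.0, 5.0), (-5.0, 5.0)])
```

For signal parameter estimation, `obj` would instead be the mean-square error between the observed sinusoid and the model output for a candidate parameter vector, matching the error function the abstract describes.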

Keywords: parameter estimation, bio-inspired computing, continuous differential evolution (CDE), periodic signals

Procedia PDF Downloads 302
932 Computational Analysis and Daily Application of the Key Neurotransmitters Involved in Happiness: Dopamine, Oxytocin, Serotonin, and Endorphins

Authors: Hee Soo Kim, Ha Young Kyung

Abstract:

Happiness and pleasure are a result of dopamine, oxytocin, serotonin, and endorphin levels in the body. In order to increase these four neurochemical levels, it is important to associate daily activities with their corresponding neurochemical releases. This includes setting goals, maintaining social relationships, laughing frequently, and exercising regularly. The likelihood of experiencing happiness increases when all four neurochemicals are released at optimal levels. The achievement of happiness is important because it increases health, productivity, and the ability to overcome adversity. To study the processing of emotions, electrical brain waves, brain structure, and neurochemicals must be analyzed. This research uses Chemcraft and Avogadro to determine the theoretical and chemical properties of the four neurochemical molecules. Each molecule's thermodynamic stability is calculated to assess its efficiency. The study found that among dopamine, oxytocin, serotonin, and alpha-, beta-, and gamma-endorphin, beta-endorphin has the lowest optimized energy, 388.510 kJ/mol. Beta-endorphin, a neurotransmitter involved in mitigating pain and stress, is thus the most thermodynamically stable and efficient of the molecules involved in the process of happiness. Through examining such properties of happiness neurotransmitters, the science of happiness is better understood.

Keywords: happiness, neurotransmitters, positive psychology, dopamine, oxytocin, serotonin, endorphins

Procedia PDF Downloads 154