Search results for: atomistic toolKit
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 115

85 Coding Considerations for Standalone Molecular Dynamics Simulations of Atomistic Structures

Authors: R. O. Ocaya, J. J. Terblans

Abstract:

The laws of Newtonian mechanics allow ab-initio molecular dynamics to model and simulate particle trajectories in materials science by defining a differentiable potential function. This paper discusses some considerations for coding ab-initio programs for simulation on a standalone computer and illustrates the approach with C-language code in the context of embedded metallic atoms in the face-centred cubic structure. The algorithms use velocity-time integration to determine the evolution of particle parameters for up to several thousand particles in a thermodynamic ensemble. Such functions are reusable and can be placed in a redistributable header library file. Although both commercial and free packages are available, their heuristic nature prevents dissection. In addition, developing one's own code has the obvious advantage of teaching techniques applicable to new problems.
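
As a concrete illustration of the velocity-time integration the abstract refers to, the sketch below shows one velocity-Verlet step in C. It is a generic, minimal sketch, not the authors' code: the force routine (for example, an embedded-atom-method evaluator) and all parameters are placeholders.

#include <stddef.h>

typedef struct { double x, y, z; } vec3;

/* One velocity-Verlet step for n particles of mass m and timestep dt.
   compute_forces is a placeholder for, e.g., an EAM force routine. */
void verlet_step(vec3 *r, vec3 *v, vec3 *f, size_t n, double m, double dt,
                 void (*compute_forces)(const vec3 *r, vec3 *f, size_t n))
{
    for (size_t i = 0; i < n; i++) {      /* half-kick, then drift */
        v[i].x += 0.5 * dt * f[i].x / m;
        v[i].y += 0.5 * dt * f[i].y / m;
        v[i].z += 0.5 * dt * f[i].z / m;
        r[i].x += dt * v[i].x;
        r[i].y += dt * v[i].y;
        r[i].z += dt * v[i].z;
    }
    compute_forces(r, f, n);              /* forces at the new positions */
    for (size_t i = 0; i < n; i++) {      /* second half-kick */
        v[i].x += 0.5 * dt * f[i].x / m;
        v[i].y += 0.5 * dt * f[i].y / m;
        v[i].z += 0.5 * dt * f[i].z / m;
    }
}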

Keywords: C language, molecular dynamics, simulation, embedded atom method

Procedia PDF Downloads 275
84 Effect of Hydroxyl Functionalization on the Mechanical and Fracture Behaviour of Monolayer Graphene

Authors: Akarsh Verma, Avinash Parashar

Abstract:

The aim of this article is to study the effects of the hydroxyl functional group on the mechanical strength and fracture toughness of graphene. This functional group forms the backbone of the intrinsic atomic structure of graphene oxide (GO). Molecular dynamics-based simulations were performed in conjunction with reactive force field (ReaxFF) parameters to capture the mode-I fracture toughness of hydroxyl-functionalised graphene. Moreover, these simulations helped in concluding that the spatial distribution and concentration of the hydroxyl functional group significantly affect the fracture morphology of a graphene nanosheet. In contrast to literature investigations, atomistic simulations predicted a transition in the failure morphology of hydroxyl-functionalised graphene from brittle to ductile as a function of its spatial distribution on the graphene sheet.

Keywords: graphene, graphene oxide, ReaxFF, molecular dynamics

Procedia PDF Downloads 150
83 Framework to Organize Community-Led Project-Based Learning at a Massive Scale of 900 Indian Villages

Authors: Ayesha Selwyn, Annapoorni Chandrashekar, Kumar Ashwarya, Nishant Baghel

Abstract:

Project-based learning (PBL) activities are typically implemented in technology-enabled schools by highly trained teachers. In rural India, students have limited access to technology and quality education, so implementing typical PBL activities is challenging. This study details how Pratham Education Foundation’s Hybrid Learning model was used to implement two PBL activities related to music in 900 remote Indian villages with 46,000 students aged 10-14. The activities were completed by 69% of groups, which submitted a total of 15,000 videos (completed projects). Pratham’s H-Learning model reaches 100,000 students aged 3-14 in 900 Indian villages. The community-driven model engages students in 20,000 self-organized groups outside of school. The students are guided by 6,000 youth volunteers and 100 facilitators. The students partake in learning activities across subjects with the support of community stakeholders and offline digital content on shared Android tablets. A training and implementation toolkit for PBL activities is designed by subject experts. This toolkit is essential in ensuring efficient implementation of activities, as facilitators aren’t highly skilled and have limited access to training resources. The toolkit details the activity at three levels of student engagement: enrollment, participation, and completion. The subject experts train project leaders and facilitators, who train youth volunteers. Volunteers need to be trained on how to execute the activity and guide students. The training is focused on building the volunteers’ capacity to enable students to solve problems, rather than developing the volunteers’ subject-related knowledge. This structure ensures that continuous intervention by subject matter experts isn’t required, and the onus of judging creative skills is put on community members. 46,000 students in the H-Learning program were engaged in two PBL activities related to music from April to June 2019. For one activity, students had to conduct a “musical survey” in their village by designing a survey and shooting and editing a video. This activity aimed to develop students’ information retrieval, data gathering, teamwork, communication, project management, and creativity skills. It also aimed to identify talent and document local folk music. The second activity, “Pratham Idol”, was a singing competition. Students participated in performing, producing, and editing videos. This activity aimed to develop students’ teamwork and creative skills and to give students a creative outlet. Students showcased their completed projects at village fairs, wherein a panel of community members evaluated the videos. The shortlisted videos from all villages were further evaluated by experts, who identified students and adults to participate in advanced music workshops. The H-Learning framework enables students in low-resource settings to engage in PBL and develop relevant skills by leveraging community support and using video creation as a tool. In rural India, students do not have access to high-quality education or infrastructure. Therefore, designing activities that can be implemented by community members after limited training is essential. The subject experts have minimal intervention once the activity is initiated, which significantly reduces the cost of implementation and allows the activity to be implemented at a massive scale.

Keywords: community supported learning, project-based learning, self-organized learning, education technology

Procedia PDF Downloads 151
82 Molecular Dynamics Simulation of Realistic Biochar Models with Controlled Microporosity

Authors: Audrey Ngambia, Ondrej Masek, Valentina Erastova

Abstract:

Biochar is an amorphous, carbon-rich material generated from the pyrolysis of biomass, with multifarious properties and functionality. Biochar has shown proven applications in the treatment of flue gas and of organic and inorganic pollutants in soil and water/wastewater, as a result of its multiple surface functional groups and porous structures. These properties have also shown potential in energy storage and carbon capture. The availability of diverse sources of biomass to produce biochar has increased interest in it as a sustainable and environmentally friendly material. The properties and porous structures of biochar vary depending on the type of biomass and the high heat treatment temperature (HHT). Biochars produced at HHTs between 400°C and 800°C generally have lower H/C and O/C ratios and, with increasing temperature, higher porosities, larger pore sizes and higher surface areas. While much is known experimentally, there is little knowledge of the role porous structure and functional groups play in processes occurring at the atomistic scale, which are extremely important for the optimization of biochar for applications, especially in the adsorption of gases. Atomistic simulation methods have shown the potential to generate such amorphous materials; however, most of the models available are composed of only carbon atoms or graphitic sheets, are very dense or have simple slit pores, all of which ignore the important role of heteroatoms such as O, N and S and of pore morphologies. Hence, developing realistic models that integrate these parameters is important for understanding their role in governing adsorption mechanisms, which will aid in guiding the design and optimization of biochar materials for target applications. In this work, molecular dynamics simulations in the isobaric ensemble are used to generate realistic biochar models taking into account experimentally determined H/C, O/C and N/C ratios, aromaticity, micropore size range, micropore volumes and true densities of biochars. A pore-generation approach was developed using virtual atoms: a Lennard-Jones sphere of varying van der Waals radius and softness. Its interaction with the biochar matrix via a soft-core potential allows the creation of pores with rough surfaces, while varying the van der Waals radius parameter gives control over the pore-size distribution. We focused on microporosity, creating average pore sizes of 0.5-2 nm in diameter and pore volumes in the range of 0.05-1 cm³/g, which corresponds to experimental gas-adsorption micropore sizes of amorphous porous biochars. Realistic biochar models with surface functionalities, micropore size distribution and pore morphologies were developed, and they could aid in the study of adsorption processes in confined micropores.
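
The virtual-atom interaction can be pictured with a soft-core Lennard-Jones form like the one below, written as a minimal C sketch. This is one plausible functional form offered for illustration; the abstract does not give the exact potential used.

#include <math.h>

/* Soft-core Lennard-Jones between a virtual atom and a matrix atom.
   sigma sets the effective pore radius and alpha the "softness":
   as alpha -> 0 the standard 12-6 LJ potential is recovered.
   One common soft-core form; a sketch, not the paper's expression. */
double softcore_lj(double r, double eps, double sigma, double alpha)
{
    double s6    = pow(sigma, 6.0);
    double denom = alpha * s6 + pow(r, 6.0);   /* softened r^6 term */
    double frac  = s6 / denom;
    return 4.0 * eps * (frac * frac - frac);   /* repulsive - attractive */
}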

Keywords: biochar, heteroatoms, micropore size, molecular dynamics simulations, surface functional groups, virtual atoms

Procedia PDF Downloads 41
81 The Combination of Mel Frequency Cepstral Coefficients, Perceptual Linear Prediction, Jitter and Shimmer Coefficients for the Improvement of an Automatic Recognition System for Dysarthric Speech

Authors: Brahim Fares Zaidi

Abstract:

Our work aims to improve our automatic recognition system for dysarthric speech, based on hidden Markov models and the Hidden Markov Model Toolkit (HTK), to help people with pronunciation problems. We applied two speech-parameterization techniques based on Mel frequency cepstral coefficients (MFCC) and perceptual linear prediction (PLP) and concatenated them with jitter and shimmer coefficients in order to increase the recognition rate for dysarthric speech. For our tests, we used the NEMOURS database, which comprises speakers with dysarthria and normal speakers.

Keywords: ARSDS, HTK, HMM, MFCC, PLP

Procedia PDF Downloads 76
80 Structure of Grain Boundaries in α-Zirconium and Niobium

Authors: Divya Singh, Avinash Parashar

Abstract:

Due to their superior mechanical and creep properties and favourable nuclear cross sections, zirconium- and niobium-based (Zr-Nb) alloys are commonly used as nuclear materials for the manufacturing of fuel cladding and pressure tubes in nuclear power plants. In this work, symmetrical tilt grain boundary (STGB) structures in α-Zr are studied for their structure and energies along two tilt axes, [0001] and [0-110], using MD-based simulations. Tilt grain boundaries are obtained along the [0001] tilt axis, and special twin structures are obtained along the [0-110] tilt axis in α-Zr. For Nb, STGBs are constructed along the [100] and [110] axes using atomistic simulations. The correlation between GB structures and their energies is subsequently examined. A close relationship is found to exist between an individual GB structure and its energy in both α-Zr and Nb. It is also concluded that the energies of the more coherent twin grain boundaries are lower than those of the symmetrical tilt grain boundaries.
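
For reference, the grain-boundary energy in atomistic studies of this kind is commonly evaluated from the excess energy of the bicrystal cell; a standard definition (not necessarily the exact expression used in this paper) is

\gamma_{GB} = \frac{E_{GB} - N E_{bulk}}{2A},

where E_{GB} is the total energy of the bicrystal supercell containing N atoms, E_{bulk} is the per-atom energy of the perfect bulk crystal, A is the boundary area, and the factor of 2 accounts for the two equivalent boundaries present in a periodic cell.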

Keywords: grain boundaries, molecular dynamics, grain boundary energy, hcp crystal

Procedia PDF Downloads 231
79 Visualization-Based Feature Extraction for Classification in Real-Time Interaction

Authors: Ágoston Nagy

Abstract:

This paper introduces a method that uses unsupervised machine learning to visualize the feature space of a dataset in 2D, in order to find the most characteristic segments in the set. After dimension reduction, users can select clusters by manual drawing. Selected clusters are recorded into a data model that is used for later predictions based on real-time data. Predictions are made with supervised learning, using the Gesture Recognition Toolkit. The paper introduces two example applications: a semantic audio organizer for analyzing incoming sounds, and a gesture database organizer in which gestural data (recorded by a Leap Motion) is visualized for further manipulation.

Keywords: gesture recognition, machine learning, real-time interaction, visualization

Procedia PDF Downloads 323
78 Nudge Plus: Incorporating Reflection into Behavioural Public Policy

Authors: Sanchayan Banerjee, Peter John

Abstract:

Nudge plus is a modification of the toolkit of behavioural public policy. It incorporates an element of reflection (the plus) into the delivery of a nudge, either blended in or made proximate. Nudge plus builds on recent work combining heuristics and deliberation. It may be used to design pro-social interventions that help preserve the autonomy of the agent. The argument turns on seminal work on dual systems, which presents a subtler relationship between fast and slow thinking than commonly assumed in the classic literature in behavioural public policy. We review classic and recent work on dual processes to show that a hybrid is more plausible than the default-interventionist or parallel-competitive framework. We define nudge plus, set out what reflection could entail, provide examples, outline causal mechanisms, and draw testable implications.

Keywords: nudge, nudge plus, think, dual process theory

Procedia PDF Downloads 157
77 Expert and Novice Problem-Solvers Differences: A Discourse for Effective Teaching Delivery in Physics Classrooms

Authors: Abubakar Sa’adatu Mohammed

Abstract:

This paper reports on a study of problem-solving differences between expert and novice problem-solvers, toward effective physics teaching. Significant differences were found both at the conceptual level and at the level of critical thinking, creative thinking and reasoning. It is suggested that, for the successful solution of a problem, conceptual knowledge alone may not be sufficient; knowledge of how the conceptual knowledge should be applied (problem-solving skills) is also needed. It is hoped that this research might contribute to efforts to explore ways for students to acquire a powerful conceptual toolkit based on expert problem-solvers' approaches, for effective teaching delivery.

Keywords: conceptual knowledge, procedural knowledge, critical thinking, creative thinking, reasoning ability

Procedia PDF Downloads 268
76 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on the performance of the system, task scheduling is often chosen for assigning requests to resources in an efficient way based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed scheduling approach prioritizes the tasks of cloud applications according to limits set by six-sigma control charts based on dynamic threshold values. Further, the proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that the proposed algorithm is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
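
To make the control-chart idea concrete, the C sketch below bins tasks into priority levels from sigma zones around the mean instruction length. The zone-to-priority mapping and all names are illustrative assumptions; the authors' implementation (in the Java-based CloudSim toolkit, with dynamic thresholds) is not reproduced here.

#include <math.h>
#include <stddef.h>

/* Control-chart centre line and sigma from the task lengths in the queue. */
void chart_limits(const double *len, size_t n, double *mean, double *sigma)
{
    double s = 0.0, ss = 0.0;
    for (size_t i = 0; i < n; i++) s += len[i];
    *mean = s / (double)n;
    for (size_t i = 0; i < n; i++) ss += (len[i] - *mean) * (len[i] - *mean);
    *sigma = sqrt(ss / (double)n);
}

/* Map a task to a priority level by its control-chart zone:
   within 1 sigma of the mean -> 1, within 2 sigma -> 2, else 3.
   (Hypothetical mapping, for illustration only.) */
int task_priority(double len, double mean, double sigma)
{
    double dev = fabs(len - mean);
    if (dev <= sigma) return 1;
    if (dev <= 2.0 * sigma) return 2;
    return 3;
}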

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 486
75 [Keynote Speech]: Simulation Studies of Pulsed Voltage Effects on Cells

Authors: Jiahui Song

Abstract:

In order to predict or explain a complicated biological process, it is important first to construct mathematical models that can be used to yield analytical solutions. Through numerical simulation, mathematical model results can be used to test scenarios that might not be easily attained in a laboratory experiment, or to predict parameters or phenomena. High-intensity, nanosecond-pulse electroporation has been a recent development in bioelectrics. A dynamic pore model can be achieved by including a dynamic aspect and a dependence on the pore population density in the pore-formation energy equation, to analyze and predict such electroporation effects. For greater accuracy, with the inclusion of atomistic details, molecular dynamics (MD) simulations were also carried out during this study. Besides inducing pores in cells, external voltages could also, in principle, be used to modulate action potential generation in nerves. This could have an application in electrically controlled ‘pain management’. Also, a simple model-based rate-equation treatment of the various cellular biochemical processes has been used to predict pulse-number-dependent cell survival trends.
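
A common form of the pore-creation rate equation in such dynamic pore models (the asymptotic electroporation model, quoted here for orientation rather than as the author's exact equation) is

\frac{dN}{dt} = \alpha\, e^{(V_m/V_{ep})^2} \left(1 - \frac{N}{N_0}\, e^{-q\,(V_m/V_{ep})^2}\right),

where N is the pore density, V_m the transmembrane voltage, V_{ep} the characteristic electroporation voltage, N_0 the equilibrium pore density at V_m = 0, and \alpha and q are constants.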

Keywords: model, high-intensity, nanosecond, bioelectrics

Procedia PDF Downloads 204
74 Investigations of Inclusion Complexes of Imazapyr with 2-Hydroxypropyl(β/γ) Cyclodextrin: Experimental and Molecular Modeling Approach

Authors: Abdalla A. Elbashir, Maali Saad Mokhtar, FakhrEldin O. Suliman

Abstract:

The inclusion complexes of imazapyr (IMA) with 2-hydroxypropyl-(β/γ)-cyclodextrins (HP-β/γ-CD) have been studied in aqueous media and in the solid state. In this work, fluorescence spectroscopy, electrospray-ionization mass spectrometry (ESI-MS) and ¹H NMR were used to investigate and characterize the inclusion complexes of IMA with the cyclodextrins in solution. The solid-state complexes were obtained by freeze-drying and were characterized by Fourier transform infrared spectroscopy (FTIR) and powder X-ray diffraction (PXRD). The most predominant complexes of IMA with both hosts are the 1:1 guest:host complexes. The association constants of IMA-HP-β-CD and IMA-HP-γ-CD were 115 and 215 L mol⁻¹, respectively. Molecular dynamics (MD) simulations were used to monitor the mode of inclusion and to investigate the stability of these complexes in aqueous media at the atomistic level. The results obtained indicate that these inclusion complexes are highly stable in aqueous media, thereby corroborating the experimental results. Additionally, it has been demonstrated that, in addition to hydrophobic and van der Waals interactions, the presence of hydrogen-bonding interactions of the OH···O and CH···O type between the guest and the host has enhanced the stability of these complexes remarkably.

Keywords: imazapyr, inclusion complex, herbicides, 2-hydroxypropyl-β/γ-cyclodextrin

Procedia PDF Downloads 143
73 Nanostructure and Adhesion of Cement/Polymer Fiber Interfaces

Authors: Faezeh Shalchy

Abstract:

Concrete is the most widely used material in the world. It is also one of the most versatile, yet complex, materials that humans have used for construction. However, concrete is weak in tension, and over the past thirty years many studies have sought to improve the tensile properties of concrete (cement-based materials) using a variety of methods. One of the most successful attempts is to use polymeric fibers in the structure of concrete to obtain a composite with high tensile strength and ductility. Understanding the mechanical behavior of fiber-reinforced concrete requires knowledge of the fiber/matrix interfaces at the small scale. In this study, a combination of numerical simulations and experimental techniques has been used to study the nanostructure of fiber/matrix interfaces. A new model for calcium-silicate-hydrate (C-S-H)/fiber interfaces is proposed based on scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDX) analysis. The adhesion energy between the C-S-H gel and two different polymeric fibers (polyvinyl alcohol and polypropylene) was numerically studied at the atomistic level, since adhesion is one of the key factors in the design of fiber-reinforced composites. The mechanisms of adhesion as a function of the nanostructure of the fiber/matrix interfaces are also studied and discussed.
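
In atomistic studies of this kind, the adhesion energy per unit interface area is commonly obtained from the total energies of the bonded and separated systems; a standard work-of-adhesion definition, offered here as orientation rather than as the paper's exact expression, is

E_{adh} = \frac{E_{fiber} + E_{C\text{-}S\text{-}H} - E_{fiber/C\text{-}S\text{-}H}}{A},

where the first two terms are the energies of the isolated fiber and C-S-H slabs, the third is the energy of the bonded interface model, and A is the contact area.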

Keywords: fiber-reinforced concrete, adhesion, molecular modeling

Procedia PDF Downloads 307
72 Using Multi-Arm Bandits to Optimize Game Play Metrics and Effective Game Design

Authors: Kenny Raharjo, Ramon Lawrence

Abstract:

Game designers have the challenging task of building games that engage players to spend their time and money on the game. There is an infinite number of game variations and design choices, and it is hard to systematically determine the design choices that will produce positive experiences for players. In this work, we demonstrate how multi-arm bandits can be used to automatically explore game design variations to achieve improved player metrics. The advantage of multi-arm bandits is that they allow for continuous experimentation and variation, intrinsically converge to the best solution, and require no special infrastructure to use beyond allowing minor game variations to be deployed to users for evaluation. A user study confirms that applying multi-arm bandits was successful in determining the preferred game variation with the highest play-time metrics and can be a useful technique in a game designer's toolkit.
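
The bandit loop described here can be sketched in a few lines of C. This epsilon-greedy variant is one of the simplest bandit policies and is given only as an illustration, since the abstract does not state which bandit algorithm is used: each "pull" corresponds to deploying one game variant, and the reward is the observed play-time metric.

#include <stdlib.h>

typedef struct { double value; long count; } Arm;   /* one game variant */

/* Pick a variant: explore at random with probability eps,
   otherwise exploit the best running estimate. */
int choose_arm(const Arm *arms, int k, double eps)
{
    if ((double)rand() / RAND_MAX < eps)
        return rand() % k;
    int best = 0;
    for (int i = 1; i < k; i++)
        if (arms[i].value > arms[best].value)
            best = i;
    return best;
}

/* Fold the observed play-time reward into the arm's running mean. */
void update_arm(Arm *a, double reward)
{
    a->count++;
    a->value += (reward - a->value) / (double)a->count;
}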

Keywords: game design, multi-arm bandit, design exploration and data mining, player metric optimization and analytics

Procedia PDF Downloads 486
71 Transferring of Digital DIY Potentialities through a Co-Design Tool

Authors: Marita Canina, Carmen Bruno

Abstract:

Digital Do It Yourself (DIY) is a contemporary socio-technological phenomenon enabled by technological tools. The nature and potential long-term effects of this phenomenon have been widely studied within the framework of the EU-funded project ‘Digital Do It Yourself’, in which the authors created and experimented with a specific Digital Do It Yourself (DiDIY) co-design process. The phenomenon was first studied through literature research to understand its multiple dimensions and complexity. Co-design workshops were then used to investigate the phenomenon by involving people, to achieve a complete understanding of DiDIY practices and their enabling factors. These analyses allowed the definition of the fundamental DiDIY factors, which were then translated into a design tool. The objective of the tool is to shape design concepts by transferring these factors into different environments to achieve innovation. The aim of this paper is to present the ‘DiDIY Factor Stimuli’ tool, describing the research path and the findings behind it.

Keywords: co-design process, digital DIY, innovation, toolkit

Procedia PDF Downloads 141
70 Data Management and Analytics for Intelligent Grid

Authors: G. Julius P. Roy, Prateek Saxena, Sanjeev Singh

Abstract:

Two decades ago, power distribution utilities would collect data from their customers no more frequently than once a month. The advent of the smart grid and advanced metering infrastructure (AMI) has since increased the sampling frequency, leading to a 1,000- to 10,000-fold increase in data quantity. This increase is notable, and it led to the coining of the term 'big data' in utilities. The power distribution industry is one of the largest handlers of huge and complex data, both for keeping history and for turning the data into insight. As the majority of utilities around the globe adopt smart grid technologies in mass implementations and focus primarily on the strategic interdependence and synergies of the big data coming from new information sources like AMI and intelligent SCADA, there is a rising need for new models of data management and a renewed focus on analytics to dissect data into descriptive, predictive and prescriptive subsets. The goal of this paper is to bring load disaggregation into the smart energy toolkit for commercial usage.

Keywords: data management, analytics, energy data analytics, smart grid, smart utilities

Procedia PDF Downloads 754
69 Soft Pneumatic Actuators Fabricated Using Soluble Polymer Inserts and a Single-Pour System for Improved Durability

Authors: Alexander Harrison Greer, Edward King, Elijah Lee, Safa Obuz, Ruhao Sun, Aditya Sardesai, Toby Ma, Daniel Chow, Bryce Broadus, Calvin Costner, Troy Barnes, Biagio DeSimone, Yeshwin Sankuratri, Yiheng Chen, Holly Golecki

Abstract:

Although a relatively new field, soft robotics is experiencing a rise in applicability in the secondary school setting through the Soft Robotics Toolkit, shared fabrication resources, and a design competition. Exposing students outside of university research groups to this rapidly growing field allows the soft robotics industry to develop in new and imaginative ways. Soft robotic actuators have remained difficult to implement in classrooms because of their relative cost or difficulty of fabrication. Traditionally, a two-part molding system is used; however, this configuration often results in delamination. In an effort to make soft robotics more accessible to young students, we aim to develop a simple, single-mold method of fabricating soft robotic actuators from common household materials. These actuators are made by embedding a soluble polymer insert into silicone. These inserts can be made from hand-cut polystyrene, 3D-printed polyvinyl alcohol (PVA) or acrylonitrile butadiene styrene (ABS), or molded sugar. The insert is then dissolved using an appropriate solvent, such as water or acetone, leaving behind a negative form that can be pneumatically actuated. The resulting actuators are seamless, eliminating the instability of adhering multiple layers together. The benefit of this approach is twofold: it simplifies the process of creating a soft robotic actuator and, in turn, increases its effectiveness and durability. To quantify the increased durability of the single-mold actuator, it was tested against the traditional two-part mold. The single-mold actuator withstood actuation at 20 psi for 20 times the duration of the traditional method. The ease of fabrication of these actuators makes them more accessible to hobbyists and students in classrooms. After developing these actuators, we applied them, in collaboration with a ceramics teacher at our school, to a glove that transfers the nuanced hand motions used in throwing pottery from an expert artist to a novice. We quantified the improvement in the users’ pottery-making skill when wearing the glove using image analysis software. The seamless actuators proved to be robust in this dynamic environment. Seamless soft robotic actuators created by high school students show the applicability of the Soft Robotics Toolkit for secondary STEM education and outreach. Making students aware of what is possible through projects like this will inspire the next generation of innovators in materials science and robotics.

Keywords: pneumatic actuator fabrication, soft robotic glove, soluble polymers, STEM outreach

Procedia PDF Downloads 101
68 Monte Carlo Simulation of Pion Particles

Authors: Reza Reiazi

Abstract:

Attempts to verify Geant4 hadronic physics for transporting antiproton beams using the standard physics lists have not reached reasonable results because of a lack of reliable cross-section data and of reliable models to predict the final states of annihilation products. Since most of the antiproton annihilation energy is carried away by recoiling nuclear fragments, which result from pion interactions with the surrounding nucleons, it should be investigated whether the toolkit is verified for pions. Geant4 version 9.4.6.p01 was used. Dose calculation was done using 700 MeV pions hitting a water tank, applying the standard physics lists. We conclude that the depth dose of a negative-pion beam predicted by the Geant4 standard physics lists is not the same for all investigated models. Since the nuclear fragments deposit their energy over a short distance, they are the most important source of dose deposition near the annihilation vertex of antiproton beams.

Keywords: Monte Carlo, Pion, simulation, antiproton beam

Procedia PDF Downloads 403
67 A Case Study Approach to Rate the Eco-Sensitivity of Green Infrastructure Solutions

Authors: S. Saroop, D. Allopi

Abstract:

In the area of civil infrastructure, there is an urgent need to apply technologies that deliver infrastructure sustainably and cost-effectively. Civil engineering projects can have a significant impact on ecological and social systems if not correctly planned, designed and implemented. They can also respond to climate change by addressing flooding and sustainability. Poor design choices made now can leave future generations living in a climate with depleted resources and without green spaces. The objective of the research study was to rate the eco-sensitivity of various green infrastructure technologies that can be used in township infrastructure at the various stages of a project. This paper discusses the Green Township Infrastructure Design Toolkit, which is used to rate the sustainability of infrastructure service projects. Case studies were undertaken on a range of infrastructure projects to test the sensitivity of various design solutions against sustainability criteria. The green reporting tools ensure efficient, economical and sustainable provision of infrastructure services.

Keywords: eco-efficiency, green infrastructure, green technology, infrastructure design, sustainable development

Procedia PDF Downloads 354
66 Digital Homeostasis: Tangible Computing as a Multi-Sensory Installation

Authors: Andrea Macruz

Abstract:

This paper explores computation as a process for design by examining how computers can become more than an operative strategy in a designer's toolkit. It documents this, building upon concepts of neuroscience and Antonio Damasio's Homeostasis Theory, which is the control of bodily states through feedback intended to keep conditions favorable for life. To do this, it follows a methodology through algorithmic drawing and discusses the outcomes of three multi-sensory design installations, which culminated from a course in an academic setting. It explains both the studio process that took place to create the installations and the computational process that was developed, related to the fields of algorithmic design and tangible computing. It discusses how designers can use computational range to achieve homeostasis related to sensory data in a multi-sensory installation. The outcomes show clearly how people and computers interact with different sensory modalities and affordances. They propose using computers as meta-physical stabilizers rather than tools.

Keywords: algorithmic drawing, Antonio Damasio, emotion, homeostasis, multi-sensory installation, neuroscience

Procedia PDF Downloads 78
65 Contemplating Charge Transport by Modeling of DNA Nucleobases Based Nano Structures

Authors: Rajan Vohra, Ravinder Singh Sawhney, Kunwar Partap Singh

Abstract:

Electrical charge transport through two basic DNA nucleobases, thymine and adenine, has been investigated and analyzed using the jellium model approach. FFT-2D computations have been performed for the semi-empirical extended Hückel theory using an atomistic toolkit to evaluate charge-transport metrics such as current and conductance. The computed data are further evaluated in terms of transmission spectra, HOMO-LUMO gaps and the number of electrons. We have scrutinized the behavior of the devices in the range of −2 V to 2 V with a step size of 0.2 V. We observe that both thymine and adenine can act as molecular devices when sandwiched between two gold probes. A prominent observation is a drop in the HOMO-LUMO gaps of adenine and thymine when working as devices, compared to their intrinsic values; this is more visible in the case of adenine. The current in the thymine-based device exhibits a linear increase with voltage in spite of its low conductance. Further, the broad transmission peaks represent the strong coupling of the electrodes to the scattering molecule (thymine). Moreover, the observed current in the case of thymine is almost 3-4 times that observed for adenine. A negative differential resistance (NDR) effect has been observed in the adenine-based device at higher bias voltages and can be utilized in various future electronics applications.
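
The currents quoted in transport studies of this kind follow from the transmission spectrum through the standard Landauer relation, given here for orientation:

I(V) = \frac{2e}{h} \int T(E,V)\,\left[f(E-\mu_L) - f(E-\mu_R)\right] dE,

where T(E,V) is the bias-dependent transmission, f is the Fermi function, and \mu_L and \mu_R are the electrochemical potentials of the two gold electrodes.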

Keywords: adenine, DNA, extended Huckel, thymine, transmission spectra

Procedia PDF Downloads 124
64 Simulation of Hydrogenated Boron Nitride Nanotube’s Mechanical Properties for Radiation Shielding Applications

Authors: Joseph E. Estevez, Mahdi Ghazizadeh, James G. Ryan, Ajit D. Kelkar

Abstract:

Radiation shielding is an obstacle in long-duration space exploration. Boron nitride nanotubes (BNNTs) have attracted attention as an additive to radiation shielding material due to ¹⁰B's large neutron-capture cross section. ¹⁰B has an effective neutron-capture cross section suitable for low-energy neutrons ranging from 10⁻⁵ to 10⁴ eV, and hydrogen is effective at slowing down high-energy neutrons. Hydrogenated BNNTs are therefore potentially an ideal nanofiller for radiation shielding composites. We use molecular dynamics (MD) simulation via Accelrys Materials Studio 6.0 to model the Young's modulus of hydrogenated BNNTs. An extrapolation technique was employed to determine the Young's modulus from the deformation of the nanostructure at its theoretical density. A linear regression was used to extrapolate the data to the theoretical density of 2.62 g/cm³. Simulation data show that hydrogenated BNNTs experience an 11% decrease in Young's modulus for (6,6) BNNTs and an 8.5% decrease for (8,8) BNNTs compared to non-hydrogenated BNNTs. Hydrogenated BNNTs are a viable option as a nanofiller for radiation shielding nanocomposite materials for long-range and long-duration space exploration.
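
The linear-regression extrapolation step can be illustrated with a closed-form least-squares fit in C. The routine below is a generic sketch (the paper's data points are not given, so any inputs would be hypothetical), evaluating the fitted line at the theoretical density of 2.62 g/cm³.

/* Fit y = a + b*x by least squares and evaluate the line at x0,
   e.g. modulus-vs-density pairs extrapolated to x0 = 2.62 g/cm^3. */
double extrapolate(const double *x, const double *y, int n, double x0)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    double b = (n * sxy - sx * sy) / (n * sxx - sx * sx);  /* slope     */
    double a = (sy - b * sx) / n;                          /* intercept */
    return a + b * x0;
}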

Keywords: boron nitride nanotube, radiation shielding, young modulus, atomistic modeling

Procedia PDF Downloads 265
63 Hierarchical Queue-Based Task Scheduling with CloudSim

Authors: Wanqing You, Kai Qian, Ying Qian

Abstract:

The concepts of cloud computing provide users with infrastructure, platform and software as a service, making those services more accessible to people via the Internet. To better analyze the performance of cloud computing provisioning policies as well as resource allocation strategies, a toolkit named CloudSim was proposed. With CloudSim, a cloud computing environment can be easily constructed by modelling and simulating cloud computing components such as datacenters, hosts, and virtual machines. A good scheduling strategy is the key to achieving load balancing among different machines as well as to improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumed cases on a single machine; however, they are unable to make the best decision for the unforeseen future. In a real-world scenario, there would be numerous tasks as well as several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm to schedule tasks with CloudSim by taking into account several parameters: the machines’ capacity, the priority of tasks and the history log.
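
A hedged sketch of the hierarchical-queue idea in C: tasks are binned into priority levels and dispatched, highest level first, to the virtual machine with the most spare capacity. The structures and the capacity heuristic are illustrative assumptions; the paper's history-log weighting and its CloudSim (Java) integration are omitted.

#include <stddef.h>

#define LEVELS 3

typedef struct { double length; } Task;
typedef struct { double capacity; double load; } VM;

/* Choose the VM with the most remaining capacity (illustrative heuristic). */
int pick_vm(const VM *vms, int m)
{
    int best = 0;
    for (int i = 1; i < m; i++)
        if (vms[i].capacity - vms[i].load >
            vms[best].capacity - vms[best].load)
            best = i;
    return best;
}

/* Drain the priority queues in order, level 0 (highest) first. */
void dispatch(Task *q[LEVELS], const size_t qlen[LEVELS], VM *vms, int m)
{
    for (int lvl = 0; lvl < LEVELS; lvl++)
        for (size_t t = 0; t < qlen[lvl]; t++) {
            int v = pick_vm(vms, m);
            vms[v].load += q[lvl][t].length;   /* assign task to VM */
        }
}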

Keywords: hierarchical queue, load balancing, CloudSim, information technology

Procedia PDF Downloads 395
62 The Impact of Missense Mutation in Phosphatidylinositol Glycan Class A Associated to Paroxysmal Nocturnal Hemoglobinuria and Multiple Congenital Anomalies-Hypotonia-Seizures Syndrome 2: A Computational Study

Authors: Ashish Kumar Agrahari, Amit Kumar

Abstract:

Paroxysmal nocturnal hemoglobinuria (PNH) is an acquired clonal blood disorder that manifests with hemolytic anemia, thrombosis, and peripheral blood cytopenias. The disease is caused by the deficiency of two glycosylphosphatidylinositol (GPI)-anchored proteins (CD55 and CD59) in the hemopoietic stem cells. The deficiency of GPI-anchored proteins has been associated with somatic mutations in phosphatidylinositol glycan class A (PIGA). However, the mutations that do not cause PNH are associated with multiple congenital anomalies-hypotonia-seizures syndrome 2 (MCAHS2). To the best of our knowledge, no computational study has been performed to explore the atomistic-level impact of PIGA mutations on the structure and dynamics of the protein. In the current work, we are mainly interested in gaining insight into the molecular mechanism of PIGA mutations. In the initial step, we screened the most pathogenic mutations from the pool of publicly available mutations. Further, to get a better understanding, the pathogenic mutations were mapped onto the modeled structure and subjected to 50 ns molecular dynamics simulations. Our computational study suggests that four mutations are particularly likely to alter the structural conformation and stability of the PIGA protein, which illustrates their association with the PNH and MCAHS2 phenotypes.

Keywords: homology modeling, molecular dynamics simulation, missense mutations PNH, MCAHS2, PIGA

Procedia PDF Downloads 121
61 The Concept of Development: A Normative Restructured Model in the Light of Indian Political Thought and Classical Liberalism

Authors: Sarthak S. Salunke

Abstract:

Development, as a notion, is seen from the perspective of western philosophical conceptions, and the western developed nations have become a yardstick for setting development goals for developing and underdeveloped nations around the world. This blanket notion of development becomes superficial and materialistic in the context of the vast geopolitical, territorial, cultural and behavioral diversities existing in the countries of Africa and Asia, and it tends to undermine the atomistic aspect of development. Indian political theories, which are often seen as religious philosophies, have an inherent structure for the development of the human being as an individual and as a part of society and, as a result, for the development of the State. These theories, primarily individualistic in nature, combine altruism and rationalism, guiding human beings towards constructing a collectively developed and morally sustainable society. This research focuses on applying this Indian thought, in combination with classical liberal thought, to the issues of development in diverse societies. The proposed restructured model of development is based on molecular individualism instead of the atomic-individual approach of liberals; it lets development modelers target meaningful clusters when designating development goals based on particular geopolitical, cultural and ethical requirements, making development meaningful in conjunction with global development and establishing harmony between the western and eastern worlds.

Keywords: Indian political thought, development, liberalism, molecular individualism

Procedia PDF Downloads 162
60 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit

Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic

Abstract:

Over the years, high-purity germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, have established their wide use in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature virtually allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single-energy peak and, as such, could potentially compromise the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when the identification of radionuclides, as well as their activity concentrations, is being practiced, where high precision comes as a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the equipment used. However, experimental determination of the response, i.e., efficiency curves, for a given detector-sample configuration and geometry is not always easy and requires a certain set of reference calibration sources to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if not properly taken into account. In this study, the optimisation method for two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead-layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended-range detector (XtRa HPGe, CANBERRA) and a broad-energy-range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with the experimental data, falling within an average statistical uncertainty of ∼4.6% for the XtRa and ∼1.8% for the BEGe detector, within the energy ranges of 59.4-1836.1 keV and 59.4-1212.9 keV, respectively.
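
For orientation, the experimental FEP efficiency against which such simulated models are verified is the usual point-source expression

\varepsilon(E) = \frac{N(E)}{A \cdot t \cdot p_\gamma(E)},

where N(E) is the net count in the full-energy peak, A the source activity, t the live measurement time, and p_\gamma(E) the photon emission probability.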

Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method

Procedia PDF Downloads 88
59 Evolution of Performance Measurement Methods in Conditions of Uncertainty: The Implementation of Fuzzy Sets in Performance Measurement

Authors: E. A. Tkachenko, E. M. Rogova, V. V. Klimov

Abstract:

One of the basic issues of development management is connected with performance measurement as a prerequisite for identifying the achievement of development objectives. The aim of our research is to develop an improved model for assessing a company’s development results. The model should take into account the cyclical nature of development and the high degree of uncertainty involved in dealing with numerous management tasks. Our hypotheses may be formulated as follows: Hypothesis 1. The cycle of a company’s development may be studied from the standpoint of a project cycle. To do that, methods and tools of project analysis are to be used. Hypothesis 2. The problem of uncertainty when justifying managerial decisions within the framework of a company’s development cycle can be solved through the use of the mathematical apparatus of fuzzy logic. A reasoned justification of the validity of these hypotheses is given in the article. The fuzzy-logic toolkit is applied to the case of a technology shift within an enterprise. It is proven that some restrictions in performance measurement incurred by conventional methods can be eliminated by implementing the fuzzy-logic apparatus in performance measurement models.
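
As a building block of the fuzzy-set apparatus referred to above, the C sketch below shows a triangular membership function, which maps a crisp performance-indicator value to a degree of membership in a fuzzy set. It is a generic illustration; the paper's own membership functions are not specified in the abstract.

/* Triangular membership: degree (0..1) to which a measured KPI value x
   belongs to a fuzzy set with support [a, c] and peak at b. */
double tri_membership(double x, double a, double b, double c)
{
    if (x <= a || x >= c) return 0.0;
    if (x == b) return 1.0;
    return x < b ? (x - a) / (b - a)    /* rising edge  */
                 : (c - x) / (c - b);   /* falling edge */
}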

Keywords: logic, fuzzy sets, performance measurement, project analysis

Procedia PDF Downloads 347
58 Using Entropy Maximization in Developing a Filipino Phonetically Balanced Wordlist for a Phoneme-Level Speech Recognition System

Authors: John Lorenzo Bautista, Yoon-Joong Kim

Abstract:

In this paper, a Filipino phonetically balanced word list consisting of 250 words (PBW250) was constructed for a phoneme-level ASR system for the Filipino language. Entropy maximization is used to obtain phonological balance in the list: the entropy of the phonemes in a word is maximized, providing an optimal balance in each word’s phonological distribution, using the add-delete method (the PBW algorithm), which is compared to a modified PBW algorithm implemented with a dynamic-programming approach to obtain optimization. The entropy scores gained were 4.2791 and 4.2902 for the PBW and modified algorithms, respectively. The PBW250 was recorded by 40 respondents, each contributing two sets of data. Recordings from 30 respondents were used to train an acoustic model that was tested using recordings from the remaining 10 respondents with the HMM Toolkit (HTK). The tests gave a maximum accuracy rate of 97.77% for a speaker-dependent test and 89.36% for a speaker-independent test.
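
The quantity being maximized can be computed directly from a word's phoneme counts; the C sketch below gives the Shannon entropy in bits. It is a generic illustration of the objective, not the authors' add-delete implementation.

#include <math.h>
#include <stddef.h>

/* Shannon entropy (bits) of a word's phoneme distribution,
   computed from per-phoneme occurrence counts. */
double phoneme_entropy(const int *count, size_t nsymbols)
{
    double total = 0.0, h = 0.0;
    for (size_t i = 0; i < nsymbols; i++) total += count[i];
    for (size_t i = 0; i < nsymbols; i++) {
        if (count[i] == 0) continue;
        double p = count[i] / total;
        h -= p * log2(p);                /* -sum p * log2(p) */
    }
    return h;
}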

Keywords: entropy maximization, Filipino language, Hidden Markov Model, phonetically balanced words, speech recognition

Procedia PDF Downloads 430
57 Molecular Interactions Driving RNA Binding to hnRNPA1 Implicated in Neurodegeneration

Authors: Sakina Fatima, Joseph-Patrick W. E. Clarke, Patricia A. Thibault, Subha Kalyaanamoorthy, Michael Levin, Aravindhan Ganesan

Abstract:

Heterogeneous nuclear ribonucleoprotein A1 (hnRNPA1, or A1) is associated with the pathology of different diseases, including neurological disorders and cancers. In particular, the aggregation and dysfunction of A1 have been identified as a critical driver of neurodegeneration (NDG) in multiple sclerosis (MS). Structurally, A1 includes a low-complexity domain (LCD) and two RNA-recognition motifs (RRMs), and their interdomain coordination may play a crucial role in A1 aggregation. Previous studies propose that RNA inhibitors or nucleoside analogs that bind to the RRMs can potentially prevent A1 self-association. Therefore, a molecular-level understanding of the structures, dynamics, and nucleotide interactions of A1 RRMs can be useful for developing therapeutics for NDG in MS. In this work, a combination of computational modelling and biochemical experiments was employed to analyze a set of RNA-A1 RRM complexes. Initially, atomistic models of RNA-RRM complexes were constructed by modifying known crystal structures (e.g., PDBs: 4YOE and 5MPG) and through molecular docking calculations. The complexes were optimized using molecular dynamics simulations (200-400 ns), and their binding free energies were computed. The binding affinities of the selected complexes were validated using a thermal shift assay. Further, the most important molecular interactions contributing to the overall stability of the RNA-A1 RRM complexes were deduced. The results highlight that adenine and guanine are the most suitable nucleotides for high-affinity binding with A1. These insights will be useful in the rational design of nucleotide analogs for targeting A1 RRMs.

Keywords: hnRNPA1, molecular docking, molecular dynamics, RNA-binding proteins

Procedia PDF Downloads 88
56 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release-cycle frequency and business-logic time to market. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable-infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, which would affect running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable-infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set in other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open-source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more important disadvantages. Resource allocation is greatly improved by using linuxkit, which has a very small footprint (around 35 MB). Also, the system is more secure, since linuxkit installs the minimum set of dependencies needed to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 152