Search results for: structural kinetic model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19850

7640 Effect of the Applied Bias on Miniband Structures in Dimer Fibonacci InAs/Ga1-xInxAs Superlattices

Authors: Z. Aziz, S. Terkhi, Y. Sefir, R. Djelti, S. Bentata

Abstract:

The effect of a uniform electric field across multibarrier systems (InAs/InxGa1-xAs) is exhaustively explored by a computational model using the exact Airy function formalism and the transfer-matrix technique. In the case of the biased DFHBSL structure, a strong reduction in the transmission properties was observed, and the width of the miniband structure decreases linearly with increasing applied bias. This is due to the confinement of the states in the miniband structure, which becomes increasingly important (Wannier-Stark effect).
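
As a hedged illustration of the transfer-matrix step, the sketch below computes the transmission coefficient of a biased multibarrier stack in Python. It replaces the paper's exact Airy-function treatment of the tilted potential with a staircase of constant-potential slices, and all material parameters (barrier height, layer widths, the ħ²/2m value) are illustrative assumptions, not the study's values.

```python
# Minimal transfer-matrix sketch for a biased multibarrier stack.
# Staircase approximation of the bias; NOT the exact Airy-function formalism.
import numpy as np

HBAR2_2M = 0.0381  # eV*nm^2, hbar^2/(2 m_e); an effective mass would rescale this

def transmission(E, widths, potentials):
    """T(E) through a sequence of constant-potential regions (widths in nm, eV)."""
    k = np.sqrt((E - np.asarray(potentials, dtype=complex)) / HBAR2_2M)
    bounds = np.cumsum(widths)[:-1]              # interfaces between regions
    M = np.eye(2, dtype=complex)
    for j, x in enumerate(bounds):               # match psi and psi' at each interface
        k1, k2 = k[j], k[j + 1]
        e1p, e1m = np.exp(1j * k1 * x), np.exp(-1j * k1 * x)
        e2p, e2m = np.exp(1j * k2 * x), np.exp(-1j * k2 * x)
        r = k1 / k2
        Mj = 0.5 * np.array([[(1 + r) * e1p / e2p, (1 - r) * e1m / e2p],
                             [(1 - r) * e1p / e2m, (1 + r) * e1m / e2m]])
        M = Mj @ M
    t = np.linalg.det(M) / M[1, 1]               # outgoing-wave boundary condition
    return float((k[-1] / k[0]).real * abs(t) ** 2)

widths = [3.0, 2.0, 3.0, 2.0, 3.0]               # nm: well/barrier/well/barrier/well
V = np.array([0.0, 0.5, 0.0, 0.5, 0.0])          # eV, illustrative barrier height
centers = np.cumsum(widths) - np.array(widths) / 2
for bias in (0.0, 0.05, 0.10):                   # eV dropped across the stack
    Vb = V - bias * centers / sum(widths)        # staircase tilt of the profile
    print(bias, [round(transmission(E, widths, Vb), 4) for E in (0.1, 0.2, 0.3)])
```

Increasing the bias shifts and suppresses the transmission resonances, the qualitative behaviour the abstract attributes to Wannier-Stark localization.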

Keywords: dimer Fibonacci height barrier superlattices, singular extended state, exact Airy function, transfer matrix formalism

Procedia PDF Downloads 289
7639 Study of Oxidative Processes in Blood Serum in Patients with Arterial Hypertension

Authors: Laura M. Hovsepyan, Gayane S. Ghazaryan, Hasmik V. Zanginyan

Abstract:

Hypertension (HD) is the most common cardiovascular pathology that causes disability and mortality in the working population. Most often, heart failure (HF), which is based on myocardial remodeling, leads to death in hypertension. Recently, endothelial dysfunction (EDF), a violation of the functional state of the vascular endothelium, has been assigned a significant role in the structural changes in the myocardium and the occurrence of heart failure in patients with hypertension. It has now been established that tissues affected by inflammation form increased amounts of superoxide radical and NO, which play a significant role in the development and pathogenesis of various pathologies. They mediate inflammation, modify proteins and damage nucleic acids. The aim of this work was to study the processes of oxidative modification of proteins (OMP) and the production of nitric oxide in hypertension. In the experimental work, the blood of 30 donors and 33 patients with hypertension was used. For the quantitative determination of OMP products, a method based on the reaction of oxidized amino acid residues of proteins with 2,4-dinitrophenylhydrazine (DNPH), forming 2,4-dinitrophenylhydrazones, was used; the amount of these products was determined spectrophotometrically. The optical density of the formed carbonyl derivatives of dinitrophenylhydrazones was recorded at different wavelengths: 356 nm for aliphatic ketone dinitrophenylhydrazones (KDNPH) of neutral character; 370 nm for aliphatic aldehyde dinitrophenylhydrazones (ADNPH) of neutral character; 430 nm for aliphatic KDNPH of basic character; and 530 nm for basic aliphatic ADNPH. Nitric oxide was determined by photometry using the Griess reagent. Absorbance was measured on a Thermo Scientific Evolution 201 SF at a wavelength of 546 nm. The results of the studies showed that in patients with arterial hypertension, an increased level of nitric oxide in the blood serum is observed, and there is also a tendency toward an increase in the intensity of oxidative modification of proteins at wavelengths of 270 nm and 363 nm, which indicates a statistically significant increase in aliphatic aldehyde and ketone dinitrophenylhydrazones. The increase in the intensity of oxidative modification of blood plasma proteins in the studied patients reflects the general direction of free radical processes and, in particular, the oxidation of proteins throughout the body. A decrease in the activity of the antioxidant system also leads to a violation of protein metabolism. The most important consequence of the oxidative modification of proteins is the inactivation of enzymes.
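
The abstract does not show the conversion from absorbance to carbonyl content, so the following is a hedged worked example of the standard Beer-Lambert step; the molar absorptivity (a value around 22,000 M⁻¹·cm⁻¹ is commonly cited for 2,4-dinitrophenylhydrazones) and the sample numbers are assumptions, not the study's data.

```python
# Hedged sketch: absorbance of DNPH-derivatized carbonyls -> nmol carbonyl/mg protein.
EPSILON = 22_000      # M^-1 cm^-1, assumed molar absorptivity of the hydrazones
PATH_CM = 1.0         # cuvette path length, cm

def carbonyl_nmol_per_mg(absorbance, protein_mg_per_ml):
    molar = absorbance / (EPSILON * PATH_CM)   # Beer-Lambert: c = A/(eps*l), mol/L
    nmol_per_ml = molar * 1e6                  # 1 mol/L = 1e6 nmol/mL
    return nmol_per_ml / protein_mg_per_ml

print(carbonyl_nmol_per_mg(absorbance=0.35, protein_mg_per_ml=4.0))  # ~ 4 nmol/mg
```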

Keywords: hypertension (HD), oxidative modification of proteins (OMP), nitric oxide (NO), oxidative stress

Procedia PDF Downloads 83
7638 Parallels between Training Parameters of High-Performance Athletes Determining the Long-Term Adaptation of the Body in Various Sports: Case Study on Different Types of Training and Their Gender Conditioning

Authors: Gheorghe Braniste

Abstract:

The gender gap has always been in dispute when comparing records and has been a major factor influencing best performances in various sports. Consequently, our study registers the evolution of the difference between men's and women's best performances within either cyclic or acyclic sports, considering the fact that the training sessions of high-performance athletes show both similarities and differences in the long-term adaptation of their bodies to stress and effort in breaking limits and records. Firstly, for a correct interpretation of the data and tables included in this paper, we must point out that intense muscular activity has a considerable impact on the structural organization of the organs and systems of the performer's body through the mechanism of motor-visceral reflexes, forming a high working capacity suitable for intense muscular activity. The opportunity to obtain high sports results during official competitions is due, on the one hand, to the genetic characteristics of the athlete's body and, on the other hand, to the fact that playing professional sports leaves its mark on vital morphological and functional parameters. The aim of our research is to study the landmark differences between male and female athletes in their physical development, together with their growing capacity to stand up to functional training during the competitive period of their annual training cycle. In order to evaluate the physical development of the athletes, the data of the anthropometric screenings obtained at the Olympic Training Center of the selected teams of the Republic of Moldova were interpreted and rated. During the study of physical development in terms of body height and weight, vital capacity, thoracic excursion, maximum force (Fmax), and dynamometry of the hand and back, physical development indices that allow a complex evaluation of physical development were also registered. The interdependence of the results obtained in performance sports with the morphological and functional particularities of the athletes' bodies is firmly established and cannot be disputed. Nevertheless, the registered data proved that with the increase of training capacity, the morphological and functional abilities of the female body increase and, in some respects, approach and in certain sports even slightly surpass those of men.

Keywords: physical development, indices, parameters, active body weight, morphological maturity, physical performance

Procedia PDF Downloads 101
7637 A Delphi Study of Factors Affecting the Forest Biorefinery Development in the Pulp and Paper Industry: The Case of Bio-Based Products

Authors: Natasha Gabriella, Josef-Peter Schöggl, Alfred Posch

Abstract:

Being a mature industry, the pulp and paper industry (PPI) possesses strengths stemming from its existing infrastructure, technological know-how, and abundant availability of biomass. However, the declining trend of wood-based product sales sends a clear signal to the industry to transform its business model in order to increase its profitability. With the emerging global attention on the bio-based economy and the circular economy, coupled with the low price of fossil feedstock, the PPI has started to integrate biorefinery as a value-added business model to keep the industry competitive. Nonetheless, biorefinery as an innovation exposes the PPI to some barriers, of which the uncertainty about the most promising product is one of the major hurdles. This study aims to assess factors that affect the diffusion and development of forest biorefinery in the PPI, including drivers, barriers, advantages, and disadvantages, as well as the most promising bio-based products of forest biorefinery. The study examines the identified factors according to the layers of the business environment: the macro-environment, industry, and strategic-group levels. Besides, an overview of the future state of the identified factors is elaborated so as to map the improvements necessary for implementing forest biorefinery. A two-phase Delphi method, comprising an online survey and interviews, is used to collect the empirical data for the study. The Delphi method is an effective communication tool for eliciting ideas from a group of experts in order to reach a consensus when forecasting future trends. With a panel of 50 experts, the study reveals that influential factors are found in every layer of the PPI's business environment. The political dimension appears to have a significant influence in tackling the economic barrier while reinforcing the environmental and social benefits in the macro-environment. At the industry level, biomass availability appears to be a strength of the PPI, while the knowledge gaps on technology and markets seem to be barriers. Consequently, cooperation with academia and the chemical industry has to be improved. The human resources issue is indicated as one important premise behind the preceding barrier, along with an indication of the PPI's resistance towards biorefinery implementation as an innovation. Further, cellulose-based products are acknowledged for near-term product development, whereas lignin-based products are expected to gain importance in the long-term future.
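
The abstract does not state which consensus statistic the Delphi rounds used; Kendall's coefficient of concordance W is a common choice in Delphi studies, so the sketch below is an assumed illustration of how agreement across the expert panel could be quantified between rounds.

```python
# Hedged sketch: Kendall's W as a Delphi consensus measure (assumed, not the paper's).
import numpy as np

def kendalls_w(ranks):
    """Kendall's W for an (m experts x n items) matrix of ranks, no ties."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    col_sums = ranks.sum(axis=0)
    s = ((col_sums - col_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Hypothetical second-round rankings of five candidate drivers by four experts:
rounds = np.array([[1, 2, 3, 4, 5],
                   [2, 1, 3, 5, 4],
                   [1, 3, 2, 4, 5],
                   [1, 2, 4, 3, 5]])
print(f"W = {kendalls_w(rounds):.2f}")   # values near 1 indicate consensus
```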

Keywords: forest biorefinery, pulp and paper, bio-based product, Delphi method

Procedia PDF Downloads 262
7636 Synchrotron Radiation and Inverse Compton Scattering in Astrophysical Plasma

Authors: S. S. Sathiesh

Abstract:

The aim of this project is to study the synchrotron and inverse Compton scattering radiation mechanisms. Theoretically, we discuss the spectral energy distribution for both. A program was written in Fortran 90 to plot the power-law spectrum of synchrotron radiation. The importance of the power-law spectrum was discussed and studied in order to infer the physical parameters of the source from model fitting. We also discuss how to infer the physical parameters from the theoretically drawn graph; we have seen how one can infer B (the magnetic field of the source), γ_min, γ_max, and the spectral indices (p1, p2) while fitting the curve to the observed data.
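
A hedged Python analogue of the plot described (the original program is in Fortran 90): for an electron distribution N(γ) ∝ γ⁻ᵖ, optically thin synchrotron emission follows F(ν) ∝ ν⁻ᵅ with α = (p − 1)/2, so a break in p appears as a break in the spectrum. The indices and break frequency below are illustrative, not fitted values.

```python
# Hedged sketch: broken power-law synchrotron spectrum (illustrative parameters).
import numpy as np
import matplotlib.pyplot as plt

p1, p2 = 2.2, 3.2                        # assumed electron spectral indices
a1, a2 = (p1 - 1) / 2, (p2 - 1) / 2      # corresponding flux indices, a = (p-1)/2
nu = np.logspace(9, 18, 400)             # Hz
nu_b = 1e14                              # assumed break frequency, Hz

F = np.where(nu < nu_b, (nu / nu_b) ** -a1, (nu / nu_b) ** -a2)
plt.loglog(nu, F)
plt.xlabel("frequency [Hz]")
plt.ylabel("relative flux")
plt.title("Broken power-law synchrotron spectrum (illustrative)")
plt.show()
```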

Keywords: blazars/quasars, beaming, synchrotron radiation, synchrotron self-Compton, inverse Compton scattering, Mrk 421

Procedia PDF Downloads 406
7635 3D Finite Element Analysis of Yoke Hybrid Electromagnet

Authors: Hasan Fatih Ertuğrul, Beytullah Okur, Huseyin Üvet, Kadir Erkan

Abstract:

The objective of this paper is to analyze a 4-pole hybrid magnetic levitation system using 3D finite element and analytical methods. The magnetostatic analysis of the system is carried out using the ANSYS Maxwell 3D package. An analytical model is derived by the magnetic equivalent circuit (MEC) method. The purpose of the magnetostatic analysis is to determine the characteristics of the attractive force and rotational torques as functions of air gap clearance, inclination angle, and current excitation. The comparison between the 3D finite element analysis and the analytical results is presented in the remainder of the paper.
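
As a hedged illustration of the MEC step, the sketch below evaluates the gap flux density and Maxwell attractive force for a simple electromagnet whose reluctance is dominated by two series air gaps (iron reluctance neglected). All numbers are illustrative assumptions, not the paper's 4-pole design values.

```python
# Hedged MEC sketch: two series air gaps dominate the magnetic circuit.
import numpy as np

MU0 = 4e-7 * np.pi          # H/m

def gap_force(N, I, gap_m, pole_area_m2):
    reluctance = 2 * gap_m / (MU0 * pole_area_m2)   # R = l/(mu0*A) per gap, two in series
    flux = N * I / reluctance                       # Wb, from the MMF N*I
    B = flux / pole_area_m2                         # T
    force = B ** 2 * pole_area_m2 / (2 * MU0)       # Maxwell stress, N per pole face
    return B, 2 * force                             # total over both faces

for g_mm in (0.5, 1.0, 2.0):
    B, F = gap_force(N=500, I=2.0, gap_m=g_mm * 1e-3, pole_area_m2=4e-4)
    print(f"gap {g_mm} mm: B = {B:.2f} T, F = {F:.1f} N")
```

The strong (inverse-square) dependence of force on gap length is exactly what the finite element sweep over air gap clearances probes.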

Keywords: yoke hybrid electromagnet, 3D finite element analysis, magnetic levitation system, magnetostatic analysis

Procedia PDF Downloads 711
7634 Implementing a Database from a Requirement Specification

Authors: M. Omer, D. Wilson

Abstract:

Creating a database schema is essentially a manual process. From a requirement specification, the information contained within has to be analyzed and reduced into a set of tables, attributes and relationships. This is a time-consuming process that has to go through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool to produce a relational database schema from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that the first draft of a relational database schema can be extracted from a requirement specification by using NLP tools and techniques with minimum user intervention. Therefore, this method is a step forward in finding a solution that requires little or no user intervention.
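
The paper's tool is built on Stanford CoreNLP 3.3.1 in Java; as a hedged Python analogue of the extraction idea, the sketch below uses spaCy (an assumed substitute, requiring the small English model to be installed) to pull candidate entities from noun chunks and candidate relationships from subject-verb-object triples.

```python
# Hedged sketch with spaCy, standing in for the paper's Stanford CoreNLP pipeline.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
text = ("A customer places one or more orders. "
        "Each order contains several products.")     # toy requirement text
doc = nlp(text)

# Noun chunks suggest candidate tables/attributes:
print("candidate entities:", [chunk.root.lemma_ for chunk in doc.noun_chunks])

# Subject-verb-object triples suggest candidate relationships:
for token in doc:
    if token.pos_ == "VERB":
        subj = [w.lemma_ for w in token.lefts if w.dep_ == "nsubj"]
        obj = [w.lemma_ for w in token.rights if w.dep_ == "dobj"]
        if subj and obj:
            print("candidate relationship:", subj[0], token.lemma_, obj[0])
```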

Keywords: information extraction, natural language processing, relation extraction

Procedia PDF Downloads 248
7633 Reconceptualizing “Best Practices” in the Public Sector

Authors: Eftychia Kessopoulou, Styliani Xanthopoulou, Ypatia Theodorakioglou, George Tsiotras, Katerina Gotzamani

Abstract:

Public sector managers frequently herald that implementing best practices as a set of standards may lead to superior organizational performance. However, recent research questions the objectification of best practices, highlighting: a) the inability of public sector organizations to develop innovative administrative practices, as well as b) the adoption of stereotypical renowned practices inculcated in the public sector by international governance bodies. The process through which organizations construe what a best practice is still remains a black box that is yet to be investigated, given the trend of continuous change in public sector performance, as well as the burgeoning interest in sharing popular administrative practices put forward by international bodies. This study aims to describe and understand how organizational best practices are constructed by public sector performance management teams, i.e., benchmarkers, during the benchmarking-mediated performance improvement process, and what mechanisms enable this construction. A critical realist action research methodology is employed, starting from a description of various approaches to the nature of best practices when a benchmarking-mediated performance improvement initiative, such as the Common Assessment Framework, is applied. Firstly, we observed the benchmarkers' management of best practices in a public organization, so as to map their theories-in-use. As a second step, we contextualized best administrative practices by reflecting the different perspectives that emerged from the previous stage in the design and implementation of an interview protocol. We used this protocol to conduct 30 semi-structured interviews with "best practice" process owners, in order to examine their experiences and performance needs. Previous research on best practices has shown that the needs and intentions of benchmarkers cannot be detached from the causal mechanisms of the various contexts in which they work. Such causal mechanisms can be found in: a) process owner capabilities, b) the structural context of the organization, and c) state regulations. Therefore, we developed an interview protocol that was theoretically informed in its first part, to spot causal mechanisms suggested by previous research studies, and supplemented it with questions regarding the provision of best practice support from the government. Findings of this work include: a) a causal account of the nature of best administrative practices in the Greek public sector that sheds light on explaining their management, b) a description of the various contexts affecting best practice conceptualization, and c) a description of how their interplay changed the organization's best practice management.

Keywords: benchmarking, action research, critical realism, best practices, public sector

Procedia PDF Downloads 113
7632 Deep Reinforcement Learning Approach for Trading Automation in the Stock Market

Authors: Taylan Kabbani, Ekrem Duman

Abstract:

The design of adaptive systems that take advantage of financial markets while reducing risk can bring currently stagnant wealth into the global market. However, most efforts made to generate successful deals in trading financial assets rely on Supervised Learning (SL), which suffers from various limitations. Deep Reinforcement Learning (DRL) offers a way to overcome these drawbacks of SL approaches by combining the financial asset price "prediction" step and the portfolio "allocation" step in one unified process, producing fully autonomous systems capable of interacting with their environment to make optimal decisions through trial and error. In this paper, a continuous action space approach is adopted to give the trading agent the ability to gradually adjust the portfolio's positions at each time step (dynamically re-allocating investments), resulting in better agent-environment interaction and faster convergence of the learning process. In addition, the approach supports managing a portfolio with several assets instead of a single one. This work presents a novel DRL model to generate profitable trades in the stock market, effectively overcoming the limitations of supervised learning approaches. We formulate the trading problem, i.e., the agent-environment interaction, as a Partially Observed Markov Decision Process (POMDP) model, considering the constraints imposed by the stock market, such as liquidity and transaction costs. More specifically, we design an environment that simulates the real-world trading process by augmenting the state representation with ten different technical indicators and sentiment analysis of news articles for each stock. We then solve the formulated POMDP problem using the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm, which can learn policies in high-dimensional and continuous action spaces like those typically found in the stock market environment. From the point of view of stock market forecasting and intelligent decision-making mechanisms, this paper demonstrates the superiority of deep reinforcement learning in financial markets over other types of machine learning, such as supervised learning, and proves its credibility and advantages for strategic decision-making.
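
A hedged sketch of the setup described, using stable-baselines3's TD3 on a toy gymnasium environment: the prices are synthetic random walks and the state omits the paper's ten technical indicators and news sentiment, so this shows only the continuous-action, transaction-cost-aware skeleton, not the authors' environment.

```python
# Hedged sketch: continuous-action portfolio environment + TD3 (stable-baselines3).
import numpy as np
import gymnasium as gym
from stable_baselines3 import TD3

class ToyTradingEnv(gym.Env):
    def __init__(self, n_assets=3, horizon=200):
        super().__init__()
        self.n, self.horizon = n_assets, horizon
        self.action_space = gym.spaces.Box(-1.0, 1.0, (n_assets,), np.float32)
        self.observation_space = gym.spaces.Box(-np.inf, np.inf, (n_assets,), np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        self.returns = self.np_random.normal(0, 0.01, (self.horizon, self.n))
        self.prev_w = np.ones(self.n) / self.n
        return np.zeros(self.n, np.float32), {}

    def step(self, action):
        w = np.exp(action) / np.exp(action).sum()          # long-only weights
        r = self.returns[self.t]
        cost = 1e-3 * np.abs(w - self.prev_w).sum()        # turnover cost proxy
        reward = float(w @ r - cost)
        self.prev_w = w
        self.t += 1
        obs = self.returns[self.t - 1].astype(np.float32)  # last observed returns
        return obs, reward, self.t >= self.horizon, False, {}

model = TD3("MlpPolicy", ToyTradingEnv(), verbose=0)
model.learn(total_timesteps=2_000)                         # short demo run
```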

Keywords: the stock market, deep reinforcement learning, MDP, twin delayed deep deterministic policy gradient, sentiment analysis, technical indicators, autonomous agent

Procedia PDF Downloads 167
7631 Architecture of a Preliminary Course on Computational Thinking

Authors: Mintu Philip, Renumol V. G.

Abstract:

An introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a programming language rather than engaging in problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them to become good programmers. The proposed course is designed based on the four basic components of computational thinking: abstract thinking, logical thinking, modelling thinking and constructive thinking. In this course, students are engaged in hands-on problem-solving activities using a new problem-solving model proposed in this paper.

Keywords: computational thinking, computing education, abstraction, constructive thinking, modelling thinking

Procedia PDF Downloads 441
7630 User Experience Measurement of User Interfaces

Authors: Mohammad Hashemi, John Herbert

Abstract:

Quantifying and measuring Quality of Experience (QoE) are important and difficult concerns in Human Computer Interaction (HCI). Quality of Service (QoS) and the actual User Interface (UI) of the application are both important contributors to a user's QoE. This paper describes a framework that accurately measures the way a user uses the UI in order to model users' behaviours and profiles. It monitors the use of the mouse and of UI elements with accurate time measurement. It does this in real time, unobtrusively and efficiently, allowing the user to work as normal with the application. This real-time, accurate measurement of the user's interaction provides valuable data and insight into the use of the UI, and is also the basis for analysis of the user's QoE.

Keywords: user modelling, user interface experience, quality of experience, user experience, human and computer interaction

Procedia PDF Downloads 488
7629 Linguistic Analysis of Argumentation Structures in Georgian Political Speeches

Authors: Mariam Matiashvili

Abstract:

Argumentation is an integral part of our daily communications, formal or informal. Argumentative reasoning, techniques, and language tools are used both in personal conversations and in the business environment. Verbalization of opinions requires the use of extraordinary syntactic-pragmatic structural quantities: arguments that add credibility to the statement. The study of argumentative structures allows us to identify the linguistic features that make a text argumentative. Knowing what elements make up an argumentative text in a particular language helps the users of that language improve their skills. Also, natural language processing (NLP) has become especially relevant recently. In this context, one of the main emphases is on the computational processing of argumentative texts, which will enable the automatic recognition and analysis of large volumes of textual data. The research deals with the linguistic analysis of the argumentative structures of Georgian political speeches, particularly the linguistic structure, characteristics, and functions of the parts of the argumentative text: claims, support, and attack statements. The research aims to describe the linguistic cues that give a sentence a judgmental/controversial character and help to identify the reasoning parts of an argumentative text. The empirical data come from the Georgian Political Corpus, particularly TV debates. Consequently, the texts are of a dialogical nature, representing a discussion between two or more people (most often between a journalist and a politician). The research uses the following approaches to identify and analyze the argumentative structures: (1) lexical classification and analysis, identifying lexical items that are relevant in the process of creating argumentative texts and building a lexicon of argumentation (groups of words gathered from a semantic point of view); (2) grammatical analysis and classification, i.e., grammatical analysis of the words and phrases identified on the basis of the arguing lexicon; and (3) argumentation schemes, describing and identifying the argumentation schemes most likely used in Georgian political speeches. As a final step, we analyzed the relations between the above-mentioned components. For example, if an identified argument scheme is "Argument from Analogy", the identified lexical items semantically express analogy too, and they are most likely adverbs in Georgian. As a result, we created a lexicon of the words that play a significant role in creating Georgian argumentative structures. The linguistic analysis has shown that verbs play a crucial role in creating argumentative structures.

Keywords: Georgian, argumentation schemas, argumentation structures, argumentation lexicon

Procedia PDF Downloads 58
7628 GIS Pavement Maintenance Selection Strategy

Authors: Mekdelawit Teferi Alamirew

Abstract:

As a practical tool, the geographical information system (GIS) was used for data integration, collection, management, analysis, and output presentation in pavement management systems. There are many GIS techniques to improve maintenance activities, such as dynamic segmentation and weighted overlay analysis, which incorporate a multi-criteria decision-making process. The results indicated that the developed MPI model works sufficiently and yields adequate output for providing accurate decisions when multiple criteria are considered in prioritizing pavement sections for maintenance. Because GIS maps can express the position, extent, and severity of pavement distress features more effectively than manual approaches, the paper also offers digitized distress maps that can help agencies in their decision-making processes.
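
The abstract does not define the criteria inside the MPI; as a hedged illustration of the weighted overlay step, the sketch below combines normalised criterion scores with assumed MCDM weights to rank pavement sections (the criteria names and weights are placeholders).

```python
# Hedged sketch: weighted-overlay / MPI-style ranking with assumed criteria.
import numpy as np

rng = np.random.default_rng(0)
sections = 10
criteria = {                                # one score per pavement section
    "distress_severity": rng.random(sections),
    "traffic_volume":    rng.random(sections),
    "pavement_age":      rng.random(sections),
}
weights = {"distress_severity": 0.5, "traffic_volume": 0.3, "pavement_age": 0.2}

def normalise(x):
    return (x - x.min()) / (x.max() - x.min())

mpi = sum(w * normalise(criteria[k]) for k, w in weights.items())
priority = np.argsort(mpi)[::-1]            # highest score = maintain first
print("maintenance order of sections:", priority)
```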

Keywords: pavement, flexible, maintenance, index

Procedia PDF Downloads 48
7627 Modeling Methodologies for Optimization and Decision Support on Coastal Transport Information System (Co.Tr.I.S.)

Authors: Vassilios Moussas, Dimos N. Pantazis, Panagiotis Stratakis

Abstract:

The aim of this paper is to present the optimization methodology developed in the frame of a Coastal Transport Information System. The system will be used for the effective design of coastal transportation lines and incorporates subsystems that implement models, tools and techniques that may support the design of improved networks. The role of the optimization and decision subsystem is to provide the user with better and optimal scenarios that best fulfill any constraints, goals or requirements posed. The complexity of the problem and the large number of parameters and objectives involved led to the adoption of an evolutionary method (genetic algorithms). The problem model and the subsystem structure are presented in detail, and its support for simulation is also discussed.
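
As a hedged illustration of the evolutionary step, the sketch below runs a minimal genetic algorithm that selects which candidate coastal links to operate against a toy service-minus-cost objective; the real system's objectives, constraints and encodings are far richer than this.

```python
# Hedged GA sketch: binary link selection with tournament selection,
# one-point crossover and bit-flip mutation (toy objective, not the system's).
import numpy as np

rng = np.random.default_rng(1)
n_links, pop_size, gens = 12, 40, 60
link_cost = rng.uniform(1, 5, n_links)
demand_served = rng.uniform(0, 1, n_links)

def fitness(chrom):                          # maximise service, penalise cost
    return demand_served @ chrom - 0.3 * (link_cost @ chrom)

pop = rng.integers(0, 2, (pop_size, n_links))
for _ in range(gens):
    scores = np.array([fitness(c) for c in pop])
    i, j = rng.integers(0, pop_size, (2, pop_size))
    parents = pop[np.where(scores[i] > scores[j], i, j)]   # tournament selection
    children = parents.copy()
    for k, cut in enumerate(rng.integers(1, n_links, pop_size // 2)):
        a, b = 2 * k, 2 * k + 1                            # one-point crossover
        children[a, cut:], children[b, cut:] = parents[b, cut:], parents[a, cut:]
    flip = rng.random(children.shape) < 0.02               # bit-flip mutation
    pop = np.where(flip, 1 - children, children)

best = pop[np.argmax([fitness(c) for c in pop])]
print("selected links:", np.flatnonzero(best))
```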

Keywords: coastal transport, modeling, optimization

Procedia PDF Downloads 484
7626 Risk Identification of Investment Feasibility in Indonesia’s Toll Road Infrastructure Investment

Authors: Christo Februanto Putra

Abstract:

This paper presents the identification of risks that affect investment feasibility in Indonesian toll road infrastructure, using a qualitative survey of expert practitioners among investors, contractors, and state officials. A key problem of infrastructure investment in Indonesia, especially under the KPBU contract model, is that many risk factors in the investment plan are not calculated in thorough detail. A risk factor is a value used to provide an overview of the risk level assessment of an event, and it is a function of the probability of occurrence and the consequences of the risks that arise. The results of the survey show which risk factors directly impact investment feasibility and rank them by their impact on the investment.

Keywords: risk identification, Indonesia toll road, investment feasibility

Procedia PDF Downloads 263
7625 Detection and Identification of Antibiotic Resistant Bacteria Using Infrared Microscopy and Advanced Multivariate Analysis

Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel

Abstract:

Antimicrobial drugs have an important role in controlling illness associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global health-care problem. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing, like disk diffusion, are time-consuming, and other methods, including the E-test and genotyping, are relatively expensive. Fourier transform infrared (FTIR) microscopy is a rapid, safe, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria. The new modern infrared (IR) spectrometers with high spectral resolution enable measuring unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy has become a powerful technique, which enables the detection of structural changes associated with resistance. The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics within a time span of a few minutes. The bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories in Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 550 E. coli samples, were promising and showed that by using the infrared spectroscopic technique together with multivariate analysis, it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 85% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing the FTIR microscopy technique as a rapid and reliable method for identifying antibiotic susceptibility.
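
The exact multivariate pipeline is not specified in the abstract; as a hedged stand-in, the sketch below classifies spectra into sensitive/resistant with a standard scaler-PCA-SVM pipeline and cross-validation, using synthetic spectra in place of the measured FTIR data.

```python
# Hedged sketch: PCA + SVM classification of (synthetic) spectra, 5-fold CV.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_wavenumbers = 100, 600
base = rng.normal(0, 1, n_wavenumbers).cumsum()            # smooth shared baseline
X_sens = base + rng.normal(0, 0.4, (n_per_class, n_wavenumbers))
X_res = base + 0.15 + rng.normal(0, 0.4, (n_per_class, n_wavenumbers))
X = np.vstack([X_sens, X_res])
y = np.array([0] * n_per_class + [1] * n_per_class)        # 0 sensitive, 1 resistant

clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```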

Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility

Procedia PDF Downloads 248
7624 An Investigation into the Influence of Compression on 3D Woven Preform Thickness and Architecture

Authors: Calvin Ralph, Edward Archer, Alistair McIlhagger

Abstract:

3D woven textile composites continue to emerge as an advanced material for structural applications and composite manufacture due to their bespoke nature, through-thickness reinforcement and near-net-shape capabilities. When 3D woven preforms are produced, they are in their optimal physical state. As 3D weaving is a dry preforming technology, it relies on compression of the preform to achieve the desired composite thickness, fibre volume fraction (Vf) and consolidation. This compression of the preform during manufacture results in changes to its thickness and architecture, which can often lead to under-performance or alteration of the 3D woven composite. Unlike traditional 2D fabrics, the bespoke nature and variability of 3D woven architectures make it difficult to know exactly how each 3D preform will behave during processing. Therefore, the focus of this study is to investigate the effect of compression on differing 3D woven architectures in terms of structure, crimp or fibre waviness and thickness, as well as to analyse the accuracy of available software in predicting how 3D woven preforms behave under compression. To achieve this, 3D preforms were modelled and their compression simulated in WiseTex for varying architectures of binder style, pick density, thickness and tow size. These architectures were then woven, and samples were dry compression tested to determine the compressibility of the preforms under various pressures. Additional preform samples were manufactured using Resin Transfer Moulding (RTM) with varying compressive force. Composite samples were cross-sectioned, polished and analysed using microscopy to investigate changes in architecture and crimp. Data from dry fabric compression and composite samples were then compared alongside the WiseTex models to determine the accuracy of the prediction and to identify architecture parameters that can affect preform compressibility and stability. Results indicate that binder style/pick density, tow size and thickness have a significant effect on the compressibility of 3D woven preforms, with lower pick density allowing for greater compression and distortion of the architecture. It was further highlighted that binder style combined with pressure had a significant effect on changes to preform architecture: orthogonal binders experienced the highest level of deformation but the highest overall stability under compression, while layer-to-layer binders indicated a reduction in fibre crimp of the binder. In general, simulations showed a relative agreement with experimental results; however, deviation is evident due to assumptions present within the modelled results.

Keywords: 3D woven composites, compression, preforms, textile composites

Procedia PDF Downloads 122
7623 Evaluation of Hydrogen Particle Volume on Surfaces of Selected Nanocarbons

Authors: M. Ziółkowska, J. T. Duda, J. Milewska-Duda

Abstract:

This paper describes an approach to modeling adsorption phenomena aimed at specifying the adsorption mechanisms on localized or nonlocalized adsorbent sites when applied to nanocarbons. The concept comes from the fundamental thermodynamic description of adsorption equilibrium and is based on numerical calculations of the volume of hydrogen particles adsorbed on the surface of selected nanocarbons: a single-walled nanotube and a nanocone. This approach makes it possible to obtain information on the adsorption mechanism and, as a consequence, to select an appropriate mathematical adsorption model, thus allowing for a more reliable identification of the material's porous structure. The theoretical basis of the approach is discussed, and newly derived results of the numerical calculations are presented for the selected nanocarbons.

Keywords: adsorption, mathematical modeling, nanocarbons, numerical analysis

Procedia PDF Downloads 254
7622 Dynamic Simulation of Disintegration of Wood Chips Caused by Impact and Collisions during the Steam Explosion Pre-Treatment

Authors: Muhammad Muzamal, Anders Rasmuson

Abstract:

Wood material is extensively considered as a raw material for the production of bio-polymers, bio-fuels and value-added chemicals. However, the shortcoming in using wood as a raw material is that the enzymatic hydrolysis of wood material is difficult, because the accessibility of enzymes to hemicelluloses and cellulose is hindered by the complex chemical and physical structure of the wood. The steam explosion (SE) pre-treatment improves the digestion of wood material by creating both chemical and physical modifications in the wood. In this process, wood chips are first treated with steam at high pressure and temperature for a certain time in a steam treatment vessel. During this time, the chemical linkages between lignin and polysaccharides are cleaved and the stiffness of the material decreases. Then the steam discharge valve is rapidly opened, and the steam and wood chips exit the vessel at very high speed. These fast-moving wood chips collide with each other and with the walls of the equipment and disintegrate into small pieces. More damaged and disintegrated wood has a larger surface area and increased accessibility to hemicelluloses and cellulose. The energy required to increase the specific surface area by the same value is 70% higher in a conventional mechanical technique, i.e., an attrition mill, than in the steam explosion process. The mechanism of wood disintegration during the SE pre-treatment has been very little studied. In this study, we have simulated the collision and impact of wood chips (dimensions 20 mm x 20 mm x 4 mm) with each other and with the walls of the vessel. The wood chips are simulated as a 3D orthotropic material. Damage and fracture in the wood material have been modelled using the 3D Hashin damage model. This has been accomplished by developing a user-defined subroutine and implementing it in the FE software ABAQUS. The elastic and strength properties used for the simulation are those of spruce wood at 12% and 30% moisture content and at 20 and 160 °C, because the impacted wood chips are pre-treated with steam at high temperature and pressure. We have simulated several cases to study the effects of the elastic and strength properties of wood, the velocity of the moving chip and the orientation of the wood chip at the time of impact on the damage in the wood chips. The disintegration patterns captured by the simulations are very similar to those observed in experimentally obtained steam-exploded wood. Simulation results show that wood chips moving with higher velocity disintegrate more. Increased moisture content and temperature decrease the elastic properties and increase damage. Impact and collision in specific directions cause easy disintegration. This model can be used to efficiently design steam explosion equipment.

Keywords: dynamic simulation, disintegration of wood, impact, steam explosion pretreatment

Procedia PDF Downloads 391
7621 A Modified QuEChERS Method Using Activated Carbon Fibers as r-DSPE Sorbent for Sample Cleanup: Application to Pesticides Residues Analysis in Food Commodities Using GC-MS/MS

Authors: Anshuman Srivastava, Shiv Singh, Sheelendra Pratap Singh

Abstract:

A simple, sensitive and effective gas chromatography tandem mass spectrometry (GC-MS/MS) method was developed for the simultaneous analysis of multiple pesticide residues (organophosphates, organochlorines, synthetic pyrethroids and herbicides) in food commodities, using phenolic resin based activated carbon fibers (ACFs) as the reversed-dispersive solid phase extraction (r-DSPE) sorbent in a modified QuEChERS (Quick Easy Cheap Effective Rugged Safe) method. The acetonitrile-based QuEChERS technique was used for the extraction of the analytes from food matrices, followed by sample cleanup with ACFs instead of the traditionally used primary secondary amine (PSA). Different physico-chemical characterization techniques, such as Fourier transform infrared spectroscopy, scanning electron microscopy, X-ray diffraction and Brunauer-Emmett-Teller surface area analysis, were employed to investigate the engineering and structural properties of the ACFs. The recovery of pesticides and herbicides was tested at concentration levels of 0.02 and 0.2 mg/kg in different commodities such as cauliflower, cucumber, banana, apple, wheat and black gram. The recoveries of all twenty-six pesticides and herbicides were found within the acceptable limit (70-120%) according to the SANCO guideline, with relative standard deviation values < 15%. The limit of detection and limit of quantification of the method were in the ranges of 0.38-3.69 ng/mL and 1.26-12.19 ng/mL, respectively. In the traditional QuEChERS method, PSA used as the r-DSPE sorbent plays a vital role in the sample clean-up process and demonstrates good recoveries for multiclass pesticides. This study reports that ACFs are better at removing co-extractives in comparison with PSA, without compromising the recoveries of multiple pesticides from food matrices. Further, ACFs eliminate the need for the charcoal that is added alongside PSA in the traditional QuEChERS method to remove pigments. The developed method will be cost-effective because the ACFs are significantly cheaper than PSA. The proposed modified QuEChERS method is therefore more robust and effective and has better sample cleanup efficiency for multiclass, multi-pesticide residue analysis in different food matrices such as vegetables, grains and fruits.

Keywords: QuEChERS, activated carbon fibers, primary secondary amine, pesticides, sample preparation, carbon nanomaterials

Procedia PDF Downloads 255
7620 Quick Similarity Measurement of Binary Images via Probabilistic Pixel Mapping

Authors: Adnan A. Y. Mustafa

Abstract:

In this paper we present a quick technique to measure the similarity between binary images. The technique is based on a probabilistic mapping approach and is fast because only a minute percentage of the image pixels need to be compared to measure the similarity, and not the whole image. We exploit the power of the Probabilistic Matching Model for Binary Images (PMMBI) to arrive at an estimate of the similarity. We show that the estimate is a good approximation of the actual value, and the quality of the estimate can be improved further with increased image mappings. Furthermore, the technique is image size invariant; the similarity between big images can be measured as fast as that for small images. Examples of trials conducted on real images are presented.
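
A hedged sketch of the core idea follows: estimate the similarity of two binary images from a small random sample of pixel mappings rather than a full comparison. This is a simplification of PMMBI (a plain match ratio on synthetic images), illustrating how the estimate tightens as more mappings are used.

```python
# Hedged sketch: similarity from sampled pixel mappings (simplified PMMBI idea).
import numpy as np

rng = np.random.default_rng(42)
A = rng.integers(0, 2, (1024, 1024), dtype=np.uint8)
B = A.copy()
B[rng.random(B.shape) < 0.1] ^= 1           # B differs from A in ~10% of pixels

def estimate_similarity(A, B, n_samples):
    r = rng.integers(0, A.shape[0], n_samples)
    c = rng.integers(0, A.shape[1], n_samples)
    return (A[r, c] == B[r, c]).mean()      # fraction of sampled pixels that match

exact = (A == B).mean()
for n in (100, 1_000, 10_000):              # more mappings -> better estimate
    print(f"{n:>6} samples: estimate {estimate_similarity(A, B, n):.3f}"
          f"  (exact {exact:.3f})")
```

Because the sample size, not the image size, fixes the cost, big images are compared as fast as small ones, which is the size invariance the abstract claims.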

Keywords: big images, binary images, image matching, image similarity

Procedia PDF Downloads 182
7619 Bayesian Analysis of Change Point Problems Using Conditionally Specified Priors

Authors: Golnaz Shahtahmassebi, Jose Maria Sarabia

Abstract:

In this talk, we introduce a new class of conjugate prior distributions obtained from the conditional specification methodology. We illustrate the application of such distributions to Bayesian change point detection in Poisson processes. We obtain the posterior distribution of the model parameters using a general bivariate distribution with gamma conditionals. Simulation from the posterior is readily implemented using a Gibbs sampling algorithm. The Gibbs sampling can be implemented even when using conditional densities that are incompatible or only compatible with an improper joint density. The application of these methods is demonstrated using examples of simulated and real data.
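
As a hedged illustration, the sketch below implements a Gibbs sampler for a single change point in a sequence of Poisson counts, using ordinary conjugate gamma priors; the paper's conditionally specified bivariate gamma-conditionals prior is richer than this textbook setup.

```python
# Hedged sketch: Gibbs sampler for one Poisson change point, conjugate gamma priors.
import numpy as np

rng = np.random.default_rng(0)
n, true_tau = 100, 60
y = np.concatenate([rng.poisson(2.0, true_tau), rng.poisson(5.0, n - true_tau)])

a, b = 1.0, 1.0                        # Gamma(shape a, rate b) priors
lam1 = lam2 = 1.0
cum = np.concatenate([[0], np.cumsum(y)])
tau_samples = []

for it in range(3000):
    taus = np.arange(1, n)             # candidate change points
    logp = (cum[taus] * np.log(lam1) - taus * lam1
            + (cum[n] - cum[taus]) * np.log(lam2) - (n - taus) * lam2)
    p = np.exp(logp - logp.max())
    tau = rng.choice(taus, p=p / p.sum())
    # gamma full conditionals; numpy's gamma takes (shape, scale = 1/rate)
    lam1 = rng.gamma(a + cum[tau], 1.0 / (b + tau))
    lam2 = rng.gamma(a + cum[n] - cum[tau], 1.0 / (b + n - tau))
    if it > 500:                       # discard burn-in
        tau_samples.append(tau)

print("posterior mean change point:", np.mean(tau_samples))   # close to 60
```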

Keywords: change point, Bayesian inference, Gibbs sampler, conditional specification, gamma conditional distributions

Procedia PDF Downloads 178
7618 Accurate and Repeatable Pressure Control for Critical Testing of Advanced Ceramics Using Proportional and Derivative Controller

Authors: Benchalak Muangmeesri

Abstract:

The purpose of this paper is to discuss how to achieve the best control performance in the shaping of ceramics. The hydraulic press machine (HPM) is the most common means of shaping advanced ceramics, whose products and dimensions are formed mainly from synthetic powders. A microcontroller can be used to control the process and has set high standards in the shaping of raw materials in powder form. For the HPM, a position control system was developed, linked to the embedded PIC16F877 controller via a proportional-derivative (PD) controller. The model is simulated using MATLAB/Simulink to evaluate the control performance of the HPM. Finally, the PD controller results showed the best performance, with the smallest overshoot and the highest quality, using microcontroller control.
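
As a hedged illustration of the control law being compared, the sketch below runs a discrete PD position loop on a toy ram model (a mass with viscous damping); the gains and plant parameters are assumptions, not the tuned values from the paper's MATLAB/Simulink study.

```python
# Hedged sketch: discrete PD position control of a mass-damper "ram" model.
dt, steps = 0.001, 3000
m, c = 50.0, 400.0                 # kg, N*s/m (toy hydraulic ram)
Kp, Kd = 8e4, 2e3                  # PD gains (assumed)
setpoint = 0.05                    # m

x = v = 0.0
e_prev = setpoint                  # avoids a derivative kick on the first step
log = []
for _ in range(steps):
    e = setpoint - x
    u = Kp * e + Kd * (e - e_prev) / dt    # PD control force
    e_prev = e
    a = (u - c * v) / m                    # plant dynamics
    v += a * dt                            # semi-implicit Euler integration
    x += v * dt
    log.append(x)

overshoot = (max(log) - setpoint) / setpoint * 100
print(f"final position {log[-1]:.4f} m, overshoot {overshoot:.1f}%")
```

Raising Kd adds damping and trades a slower rise for the smaller overshoot the abstract highlights.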

Keywords: ceramics, hydraulic press, microcontroller, PD controller

Procedia PDF Downloads 339
7617 Design of a Torque Actuator in a Hybrid Multi-DOF System Taking Magnetic Saturation into Account

Authors: Hyun-Seok Hong, Tae-Chul Jeong, Huai-Cong Liu, Ju Lee

Abstract:

This paper proposes replacing the three-phase SPM used for tilting in a hybrid multi-DOF system with a single-phase torque actuator. A three-phase tilting SPM, acting only instantaneously, has the disadvantages of low electricity-use efficiency and poor controllability. A single-phase torque actuator, by comparison, has high electrical efficiency and good controllability. Thus, this will have a great influence on the development and practical use of the system. This study designed a single-phase torque actuator taking magnetic saturation into consideration, compared the SPM and FEM analyses, and performed validation through testing of the production model.

Keywords: hybrid multi-DOF system, SPM, torque actuator, UAV, drone

Procedia PDF Downloads 590
7616 Development of Instructional Material Using Scientific Approach to Make the Nature of Science (NOS) and Critical Thinking Explicit on Chemical Bonding and Intermolecular Forces Topics

Authors: Ivan Ashif Ardhana, Intan Mahanani

Abstract:

Chemistry education tends to change from a triplet representation among the macroscopic, microscopic, and symbolic levels to a tetrahedral shape. This change sets the human element at the top of learning, meaning that students are expected to solve problems involving ethics, morality, and humanity through the class. The ability to solve problems connecting theories and applications is called scientific literacy, which has been implemented implicitly in Curriculum 2013. Scientific literacy has aspects of the nature of science and critical thinking. Both can be integrated into learning using a scientific approach and scientific inquiry. Unfortunately, the scientific literacy of students in Indonesia is far from expectations, as PISA surveys have shown: the scientific literacy of Indonesian students was always among the bottom five positions from 2002 to 2012. Improving scientific literacy requires many efforts. Developing instructional material based on a scientific approach is one such effort. The instructional material contains both aspects of the nature of science and critical thinking, which are taught explicitly to improve students' understanding of science. The development goal was to produce a prototype and instructional material using a scientific approach for the chapter on chemical bonding and intermolecular forces for grade-ten high school students. The material was given both quantitative marks and suggestions through a validation process using a validation sheet instrument. The development model was adapted from the 4D model, containing four steps: define, design, develop, and disseminate. Nevertheless, the development of the instructional material was only carried out up to the third step; the final step was not done because of time, cost, and energy limitations. The developed instructional material was validated by four validators: two chemistry lecturers and two high school teachers. The results of this development research showed that the average quantitative mark of the students' book is 92.75%, which falls in the "very proper" criterion. In the same validation process, the teacher's guide book received an average mark of 96.98%, the same criterion as the students' book. Qualitative remarks, including both comments and suggestions from the validation process, were used as considerations for the revision. The results indicate that the instructional materials using a scientific approach to make the nature of science and critical thinking explicit in the chemical bonding and intermolecular forces topics are very proper for use in learning activities.

Keywords: critical thinking, instructional material, nature of science, scientific literacy

Procedia PDF Downloads 251
7615 New Iterative Algorithm for Improving Depth Resolution in Ionic Analysis: Effect of Iterations Number

Authors: N. Dahraoui, M. Boulakroune, D. Benatia

Abstract:

In this paper, the improvement by deconvolution of the depth resolution in Secondary Ion Mass Spectrometry (SIMS) analysis is considered. We have developed a new Tikhonov-Miller deconvolution algorithm in which an a priori model of the solution is included. This is a denoised and pre-deconvolved signal obtained, firstly, by the application of a wavelet shrinkage algorithm and, secondly, by introducing the denoised signal into an iterative deconvolution algorithm. In particular, we have focused on the effect of the number of iterations on the evolution of the deconvolved signals. The SIMS profiles are multilayers of boron in a silicon matrix.
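
As a hedged illustration of the two-stage idea, the sketch below denoises a SIMS-like profile by wavelet shrinkage (PyWavelets) and then applies a regularised iterative deconvolution. A Gaussian stands in for the true depth resolution function (DRF), and a plain Van Cittert update with a smoothness penalty stands in for the paper's Tikhonov-Miller algorithm.

```python
# Hedged sketch: wavelet shrinkage + regularised iterative deconvolution.
import numpy as np
import pywt

rng = np.random.default_rng(0)
true = np.zeros(512)
true[100:110] = true[250:260] = true[400:410] = 1.0     # boron-like delta layers
drf = np.exp(-0.5 * (np.arange(-50, 51) / 8.0) ** 2)    # Gaussian stand-in DRF
drf /= drf.sum()
measured = np.convolve(true, drf, mode="same") + rng.normal(0, 0.01, 512)

# Stage 1: wavelet shrinkage denoising (universal soft threshold).
coeffs = pywt.wavedec(measured, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
thr = sigma * np.sqrt(2 * np.log(measured.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[:512]

# Stage 2: iterative deconvolution; the iteration count is the knob the
# paper studies (too few: still blurred; too many: noise amplification).
x = denoised.copy()
alpha, lam = 1.0, 0.01
for _ in range(200):
    resid = denoised - np.convolve(x, drf, mode="same")
    x = x + alpha * resid + lam * np.gradient(np.gradient(x))  # smoothness penalty
    x = np.clip(x, 0, None)                                    # concentrations >= 0

print(f"max residual after deconvolution: {np.abs(resid).max():.4f}")
```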

Keywords: DRF, in-depth resolution, multiresolution deconvolution, SIMS, wavelet shrinkage

Procedia PDF Downloads 402
7614 An Analysis of the Recent Flood Scenario (2017) of the Southern Districts of the State of West Bengal, India

Authors: Soumita Banerjee

Abstract:

The state of West Bengal is watered by innumerable rivers, which differ in nature between the northern and southern parts of the state. The southern part of West Bengal is mainly drained by the river Bhagirathi-Hooghly, whose major distributaries and tributaries have divided this major river basin into many subparts, such as the Ichamati-Bidyadhari, Pagla-Bansloi, Mayurakshi-Babla, Ajay, Damodar and Kangsabati sub-basins, to name a few. These rivers drain the districts of Bankura, Burdwan, Hooghly, Nadia, Purulia, Birbhum, Midnapore, Murshidabad, North 24-Parganas, Kolkata, Howrah and South 24-Parganas. The southern part of the state has a huge number of flood-prone blocks; the factors responsible for flooding are the shape and size of the catchment area, its steep gradient from plateau to flat terrain, river bank erosion and siltation, tidal conditions especially in the lower Ganga Basin, and the very poor maintenance of embankments, which are mostly used as communication links. Along with these factors, the DVC (Damodar Valley Corporation) plays an important role both in generating floods (through the release of water) and in controlling the flood situation. This year, the whole of Gangetic West Bengal was flooded due to high-intensity and long-duration rainfall and the release of water from the Durgapur Barrage. As most of the rivers are interstate in nature, floods at times also take place with the release of water from the dams of neighbouring states like Jharkhand. Other than embankments, there are no structural measures for combating floods in West Bengal. This paper tries to analyse the reasons behind the flood situation this year, especially with the help of climatic data collected from the India Meteorological Department, flood-related data from the Irrigation and Waterways Department, West Bengal, and Global Precipitation Measurement (GPM) data for rainfall analysis. Based on a threshold value derived from the available past flood data, it is possible to predict flood events that may occur in the near future, and with the help of social media such warnings can be spread within a very short span of time to make the public aware. On a larger, governmental scale, raising the settlements situated on either bank of the river can yield a better result than building up embankments.

Keywords: dam failure, embankments, flood, rainfall

Procedia PDF Downloads 210
7613 Mathematical Modeling of the Working Principle of Gravity Gradient Instrument

Authors: Danni Cong, Meiping Wu, Hua Mu, Xiaofeng He, Junxiang Lian, Juliang Cao, Shaokun Cai, Hao Qin

Abstract:

The gravity field is of great significance in geoscience, the national economy and national security, and gravitational gradient measurement has been extensively studied due to its higher accuracy than gravity measurement. The gravity gradient sensor, one of the core devices of the gravity gradient instrument, plays a key role in measurement accuracy. Therefore, this paper starts by analyzing the working principle of the gravity gradient sensor from Newton's law, and then considers the relative motion between inertial and non-inertial systems to build an adequate mathematical model, laying a foundation for measurement error calibration and measurement accuracy improvement.
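
For reference, the quantity such an instrument measures is the gravity gradient tensor, and the role of the relative motion between inertial and non-inertial frames can be stated schematically (a standard textbook form, not the paper's derivation):

```latex
% Gravity gradient tensor: second derivatives of the potential V
% (unit: eotvos, 1 E = 10^{-9} s^{-2}); symmetric and trace-free in free space.
\[
  \Gamma_{ij} = \frac{\partial^{2} V}{\partial x_i \, \partial x_j},
  \qquad \Gamma_{ij} = \Gamma_{ji},
  \qquad \operatorname{tr}\boldsymbol{\Gamma} = \nabla^{2}V = 0 .
\]
% For two accelerometers separated by a baseline \(\Delta\mathbf{r}\) on a
% platform rotating with angular velocity \(\boldsymbol{\omega}\), the
% differential specific force mixes the gradient with rotation terms:
\[
  \mathbf{f}_A - \mathbf{f}_B =
  \dot{\boldsymbol{\omega}} \times \Delta\mathbf{r}
  + \boldsymbol{\omega} \times (\boldsymbol{\omega} \times \Delta\mathbf{r})
  - \boldsymbol{\Gamma}\,\Delta\mathbf{r},
\]
% which is why modelling the non-inertial terms is central to error calibration.
```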

Keywords: gravity gradient, gravity gradient sensor, accelerometer, single-axis rotation modulation

Procedia PDF Downloads 315
7612 Reflections from Participants and Researchers on a Trauma-Sensitive Yoga Program

Authors: Jessica Gladden

Abstract:

This study explored the perceived benefits of trauma-sensitive yoga programs. Participants attended one of two six-week trauma-sensitive yoga programs utilizing the G.R.A.C.E model, a format developed based on Emerson’s trauma-sensitive yoga guidelines and modified by the instructors. Participants in this study completed surveys on their experiences. The results of the surveys indicated that participants perceived improvements in self-care, embodiment, and mood. These results show that trauma-sensitive yoga may have benefits beyond the treatment of specific diagnoses that could be applied to a variety of populations. Reflections from one of the researchers who teaches in this program, as well as qualitative statements from the participants, will be shared to support the continued use of this method.

Keywords: yoga, trauma-sensitive, yoga therapy, trauma

Procedia PDF Downloads 144
7611 The Modification of Convolutional Neural Network in Fin Whale Identification

Authors: Jiahao Cui

Abstract:

In the past centuries, due to climate change and intense whaling, the global whale population has dramatically declined. Among the various whale species, the fin whale experienced the most drastic drop in numbers due to its popularity in whaling. Against this background, identifying fin whale calls could be immensely beneficial to the preservation of the species. This paper uses feature extraction to process the input audio signal; then a network based on AlexNet and three networks based on the ResNet model were constructed to classify fin whale calls. A mixture of the DOSITS database and the Watkins database was used during training. The results demonstrate that a modified ResNet network has the best performance considering precision and network complexity.
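
As a hedged sketch of the kind of network modification described, the snippet below adapts a stock torchvision ResNet for single-channel spectrogram input and a binary call/no-call head; it mirrors the modification idea, not the paper's exact architecture or training setup.

```python
# Hedged sketch: adapting ResNet-18 for 1-channel spectrograms, 2 classes.
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(weights=None)                       # train from scratch here
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
model.fc = nn.Linear(model.fc.in_features, 2)        # fin whale call / no call

spectrogram = torch.randn(8, 1, 128, 128)            # batch of log-mel patches
logits = model(spectrogram)
print(logits.shape)                                  # torch.Size([8, 2])

loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (8,)))
loss.backward()                                      # one illustrative step
```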

Keywords: convolutional neural network, ResNet, AlexNet, fin whale preservation, feature extraction

Procedia PDF Downloads 103