Search results for: two dimensional picture
693 Photoelastic Analysis and Finite Elements Analysis of a Stress Field Developed in a Double Edge Notched Specimen
Authors: A. Bilek, M. Beldi, T. Cherfi, S. Djebali, S. Larbi
Abstract:
Finite element analysis and photoelasticity are used to determine the stress field developed in a double edge notched specimen loaded in tension. The specimen is cut from a birefringent plate. Experimental isochromatic fringes are obtained with circularly polarized light on the analyzer of a regular polariscope. The fringes represent the loci of points of equal maximum shear stress. In order to obtain the stress values corresponding to the fringe orders recorded in the notched specimen, particularly in the neighborhood of the notches, a calibrating disc made of the same material is loaded in compression along its diameter to determine the photoelastic fringe value. This fringe value is also used in the finite element solution to obtain the simulated photoelastic fringes, the isochromatics as well as the isoclinics. A color scale is used by the software to represent the simulated fringes over the whole model. The stress concentration factor can be readily obtained at the notches. Good agreement is obtained between the experimental and the simulated fringe patterns and between the graphs of the shear stress, particularly in the neighborhood of the notches. The purpose of this paper is to show that the isochromatic and isoclinic fringe patterns in a stressed model can be obtained rapidly and accurately by finite element analysis, as the experimental procedure can be time consuming. Stress fields can therefore be analyzed in three-dimensional models as long as the meshing and the boundary conditions are properly set in the program.
Keywords: isochromatic fringe, isoclinic fringe, photoelasticity, stress concentration factor
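As background to the conversion from fringe order to stress described above, the sketch below applies the standard stress-optic relation; the fringe order, fringe value and specimen thickness are hypothetical placeholders rather than values from the paper.

```python
# Minimal sketch of the stress-optic law used to convert isochromatic fringe
# orders into maximum shear stress (standard relation, not the authors' data).
def max_shear_stress(fringe_order, fringe_value, thickness):
    """tau_max = N * f_sigma / (2 * t), with f_sigma in N/mm per fringe and t in mm."""
    return fringe_order * fringe_value / (2.0 * thickness)

# Hypothetical numbers: 3rd-order fringe, f_sigma = 7 N/mm/fringe, t = 5 mm
print(max_shear_stress(3, 7.0, 5.0))  # -> 2.1 MPa
```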
Procedia PDF Downloads 229
692 Right Solution of Geodesic Equation in Schwarzschild Metric and Overall Examination of Physical Laws
Authors: Kwan U Kim, Jin Sim, Ryong Jin Jang, Sung Duk Kim
Abstract:
108 years have passed since a great number of physicists explained astronomical and physical phenomena by solving geodesic equations in the Schwarzschild metric. However, when solving the geodesic equations in Schwarzschild metric, they did not correctly solve one branch of the component of space among spatial and temporal components of four-dimensional force and did not come up with physical laws correctly by means of physical analysis from the results obtained by solving the geodesic equations. In addition, they did not treat the astronomical and physical phenomena in a physical way based on the correct physical laws obtained from the solution of the geodesic equations in the Schwarzschild metric. Therefore, some former scholars mentioned that Einstein’s theoretical basis of a general theory of relativity was obscure and incorrect, but they did not give a correct physical solution to the problems. Furthermore, since the general theory of relativity has not given a quantitative solution to obscure and incorrect problems, the generalization of gravitational theory has not yet been successfully completed, although former scholars have thought of it and tried to do it. In order to solve the problems, it is necessary to explore the obscure and incorrect problems in a general theory of relativity based on the physical laws and to find out the methodology for solving the problems. Therefore, as the first step toward achieving this purpose, the right solution of the geodesic equation in the Schwarzschild metric has been presented. Next, the correct physical laws found by making a physical analysis of the results have been presented, the obscure and incorrect problems have been shown, and an analysis of them has been made based on the physical laws. In addition, the experimental verification of the physical laws found by us has been made.Keywords: equivalence principle, general relativity, geometrodynamics, Schwarzschild, Poincaré
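For reference, the standard relations the abstract refers to are the Schwarzschild line element and the geodesic equation; the form below is the textbook one, not the authors' reworked solution.

```latex
% Standard background relations (not the authors' new result): the
% Schwarzschild line element and the geodesic equation whose spatial and
% temporal components are discussed in the abstract.
\begin{align}
ds^2 &= -\left(1-\frac{2GM}{c^2 r}\right)c^2\,dt^2
        + \left(1-\frac{2GM}{c^2 r}\right)^{-1}dr^2
        + r^2\left(d\theta^2 + \sin^2\theta\, d\varphi^2\right),\\
\frac{d^2 x^\mu}{d\tau^2} &+ \Gamma^\mu_{\ \alpha\beta}
  \frac{dx^\alpha}{d\tau}\frac{dx^\beta}{d\tau} = 0 .
\end{align}
```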
Procedia PDF Downloads 11
691 Dosimetric Analysis of Intensity Modulated Radiotherapy versus 3D Conformal Radiotherapy in Adult Primary Brain Tumors: Regional Cancer Centre, India
Authors: Ravi Kiran Pothamsetty, Radha Rani Ghosh, Baby Paul Thaliath
Abstract:
Radiation therapy has undergone many advancements and has evolved from 2D to 3D techniques. Recently, the rapid pace of drug discoveries, cutting-edge technology, and clinical trials has driven innovative advances in computer technology and treatment planning, leading to intensity modulated radiotherapy (IMRT), which delivers an inhomogeneous dose to the tumor and normal tissues. The present study was a hospital-based experience comparing two different conformal radiotherapy techniques for brain tumors. This analytical study was conducted at the Regional Cancer Centre, India, from January 2014 to January 2015. Ten patients were selected after applying the inclusion and exclusion criteria. All the patients were treated on an Artiste Siemens linear accelerator. The tolerance level for maximum dose was 6.0 Gy for the lenses and 54.0 Gy for the brain stem, optic chiasm and optic nerves, as per RTOG criteria. Mean and standard deviation values of PTV 98%, PTV 95% and PTV 2% in IMRT were 93.16±2.9, 95.01±3.4 and 103.1±1.1 respectively; for 3D-CRT they were 91.4±4.7, 94.17±2.6 and 102.7±0.39 respectively. PTV max dose (%) in IMRT and 3D-CRT was 104.7±0.96 and 103.9±1.0 respectively. The maximum dose to the tumor can be delivered with IMRT within acceptable toxicity limits. Variables such as expertise, location of the tumor, patient condition, and the TPS influence the outcome of the treatment.
Keywords: brain tumors, intensity modulated radiotherapy (IMRT), three dimensional conformal radiotherapy (3D-CRT), radiation therapy oncology group (RTOG)
Procedia PDF Downloads 237
690 Cone Beam Computed Tomography: A Useful Diagnostic Tool to Determine Root Canal Morphology in a Sample of Egyptian Population
Authors: H. El-Messiry, M. El-Zainy, D. Abdelkhalek
Abstract:
Cone-beam computed tomography (CBCT) provides high-quality 3-dimensional images of dental structures because of its high spatial resolution. The study of dental morphology is important in research as it provides information about diversities within a population. Many studies have shown different shapes and numbers of root canals among different races, especially in molars. The aim of this study was to determine the morphology of the root canals of mandibular first and third molars in a sample of the Egyptian population using CBCT scanning. Fifty extracted mandibular first molars (M1) and fifty extracted mandibular third molars (M3) were collected. Thick rectangular molds were made of pink wax to hold the samples. The molars were embedded in the wax molds, aligned in rows with an arbitrary 0.5 cm space between them. The molds containing the samples were submitted for CBCT scanning. The number and morphology of the root canals were assessed and classified according to Vertucci's classification. The mesial and distal roots were examined separately. Finally, the data were analyzed using the Fisher exact test. The most prevalent mesial root canal type in M1 was type IV (60%) followed by type II (40%), while M3 showed a prevalence of type I (40%) and type II (40%). Distal root canal morphology showed a prevalence of type I in both M1 (66%) and M3 (86%). It can therefore be concluded that CBCT scanning provides supplemental information about the root canal configurations of mandibular molars in a sample of the Egyptian population. This study may help clinicians in the root canal treatment of mandibular molars.
Keywords: cone beam computed tomography, mandibular first molar, mandibular third molar, root canal morphology
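A minimal sketch of the Fisher exact test step mentioned above is given below; the 2x2 canal-type counts are hypothetical and only illustrate how the comparison between M1 and M3 could be run.

```python
# Sketch of the Fisher exact test used to compare canal-type frequencies
# between first (M1) and third (M3) molars; the counts are illustrative.
from scipy.stats import fisher_exact

#                type IV   other
table = [[30, 20],   # M1 mesial canals (hypothetical)
         [10, 40]]   # M3 mesial canals (hypothetical)
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```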
Procedia PDF Downloads 317
689 A 7 Dimensional-Quantitative Structure-Activity Relationship Approach Combining Quantum Mechanics Based Grid and Solvation Models to Predict Hotspots and Kinetic Properties of Mutated Enzymes: An Enzyme Engineering Perspective
Authors: R. Pravin Kumar, L. Roopa
Abstract:
Enzymes are molecular machines used in various industries such as pharmaceuticals, cosmetics, food and animal feed, paper and leather processing, and biofuels. Nevertheless, this has been possible only through the breath-taking efforts of chemists and biologists to evolve/engineer these mysterious biomolecules to do what is needed. The main agenda of this enzyme engineering project is to derive screening and selection tools to obtain focused libraries of enzyme variants with the desired qualities. The methodologies for this research include the well-established directed evolution and rational redesign, and the relatively less established yet much faster and more accurate in silico methods. This concept was initiated as a Receptor-Dependent 4-Dimensional Quantitative Structure-Activity Relationship (RD-4D-QSAR) approach to predict kinetic properties of enzymes and is extended here to study a transaminase by a 7D QSAR approach. Induced-fit scenarios were explored using Quantum Mechanics/Molecular Mechanics (QM/MM) simulations, which were then placed in a grid that stores interaction energies derived from QM parameters (QMgrid). In this study, the mutated enzymes were immersed completely inside the QMgrid, and this was combined with solvation models to predict descriptors. After statistical screening of the descriptors, the QSAR models showed > 90% specificity and > 85% sensitivity towards the experimental activity. Mapping descriptors onto the enzyme structure revealed hotspots important for enhancing the enantioselectivity of the enzyme.
Keywords: QMgrid, QM/MM simulations, RD-4D-QSAR, transaminase
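The screening statistics quoted above (> 90% specificity, > 85% sensitivity) follow the usual definitions; the sketch below computes them from an illustrative confusion matrix, not from the study's data.

```python
# Sketch of the sensitivity/specificity figures used to screen the QSAR
# models; the confusion-matrix counts below are purely illustrative.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return sensitivity, specificity

print(sensitivity_specificity(tp=44, fn=6, tn=46, fp=4))  # -> (0.88, 0.92)
```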
Procedia PDF Downloads 135
688 VISMA: A Method for System Analysis in Early Lifecycle Phases
Authors: Walter Sebron, Hans Tschürtz, Peter Krebs
Abstract:
The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system, and on the respective lifecycle phase. However, the analysis method chain still shows gaps as it should support system analysis during the lifecycle of a system from a rough concept in pre-project phase until end-of-life. This paper’s goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, like the conceptual or pre-project phase, or the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, like e.g. a control unit for a pump motor. Furthermore, it can be also applied to non-electronic system parts. The VISMA method is a graphical sketch-like method that stratifies a system and its parts in inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named system under consideration (SUC), is selected, which is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components are the content of the first shell around the SUC. Next, the input and output components to the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell by shell is added with its respective parts until the border of the complete system (external border) is reached. Last, two external shells are added to complete the system view, the environment and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC are described graphically via arrows, to highlight functional chains through the system. As a result, this method offers a clear and graphical description and overview of a system, its main parts and environment; however, the focus still remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships and possible dangers within a multidisciplinary development team.Keywords: analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety
Procedia PDF Downloads 223
687 Central Finite Volume Methods Applied in Relativistic Magnetohydrodynamics: Applications in Disks and Jets
Authors: Raphael de Oliveira Garcia, Samuel Rocha de Oliveira
Abstract:
We have developed a new computer program in Fortran 90 to obtain numerical solutions of a system of relativistic magnetohydrodynamics partial differential equations with predetermined gravitation (GRMHD), capable of simulating the formation of relativistic jets from the accretion disk of matter up to its ejection. Initially, we carried out a study of one-dimensional finite volume numerical methods, namely the Lax-Friedrichs, Lax-Wendroff and Nessyahu-Tadmor methods and Godunov-type methods that depend on Riemann problem solvers, applied to the Euler equations, in order to verify their main features and make comparisons among them. We then implemented the Nessyahu-Tadmor central finite volume method, a numerical scheme whose formulation is free of Riemann problem solvers and requires no dimensional splitting, even in two or more spatial dimensions, at this point already applied to the GRMHD equations. Finally, with the Nessyahu-Tadmor method it was possible to obtain stable numerical solutions - without spurious oscillations or excessive dissipation - of a magnetized accretion disk rotating around a central Schwarzschild black hole (BH) and immersed in a magnetosphere, with ejection of matter in the form of a jet over a distance of fourteen times the radius of the BH, a record in terms of astrophysical simulations of this kind. In our simulations we also obtained jet substructures. A great advantage is that, with our code, the GRMHD equations can be simulated on an ordinary personal computer.
Keywords: finite volume methods, central schemes, fortran 90, relativistic astrophysics, jet
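As a minimal illustration of the one-dimensional schemes compared above, the sketch below advances Burgers' equation with the first-order Lax-Friedrichs central update; the grid, time step and initial data are arbitrary, and the full Nessyahu-Tadmor GRMHD solver is of course far more involved.

```python
# Sketch of a first-order central (Lax-Friedrichs) update for 1D Burgers'
# equation, one of the unidimensional test cases mentioned in the abstract.
import numpy as np

def lax_friedrichs_step(u, dx, dt):
    f = 0.5 * u**2                               # Burgers flux f(u) = u^2 / 2
    up, um = np.roll(u, -1), np.roll(u, 1)       # periodic neighbours u_{j+1}, u_{j-1}
    fp, fm = np.roll(f, -1), np.roll(f, 1)
    # u_j^{n+1} = (u_{j+1} + u_{j-1}) / 2 - dt/(2 dx) * (f_{j+1} - f_{j-1})
    return 0.5 * (up + um) - dt / (2.0 * dx) * (fp - fm)

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.sin(2.0 * np.pi * x)                      # smooth data that steepens into a shock
dx = x[1] - x[0]
dt = 0.4 * dx                                    # CFL-limited step for |u| <= 1
for _ in range(100):
    u = lax_friedrichs_step(u, dx, dt)
```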
Procedia PDF Downloads 450
686 Temperature Distribution for Asphalt Concrete-Concrete Composite Pavement
Authors: Tetsya Sok, Seong Jae Hong, Young Kyu Kim, Seung Woo Lee
Abstract:
The temperature distribution in asphalt concrete (AC)-concrete composite pavement is one of the main factors affecting the performance life of the pavement. The temperature gradient in the concrete slab underneath the AC layer produces critical curling stresses and can cause de-bonding at the AC-concrete interface. These stresses, when enhanced by repetitive axle loadings, also contribute to fatigue damage and eventual crack development within the slab. Moreover, temperature changes within the concrete slab cause it to contract and expand, which strongly induces reflective cracking in the AC layer. In this paper, the pavement temperature was predicted numerically using a one-dimensional finite difference method (FDM) in a fully explicit scheme. The numerical model provides a fundamental and clear understanding of the heat energy balance, including incoming and outgoing thermal energies in addition to the heat dissipated in the system. Using reliable meteorological data for daily air temperature, solar radiation and wind speed, together with variable pavement surface properties, the predicted pavement temperature profile was validated against field-measured data. Additionally, the effects of AC thickness and daily air temperature on the temperature profile in the underlying concrete were investigated. Based on the obtained results, the temperature of the AC-concrete composite pavement predicted numerically by the FDM showed good accuracy compared to the field-measured data, and a thicker AC layer significantly insulates the temperature distribution in the underlying concrete slab.
Keywords: asphalt concrete, finite difference method (FDM), curling effect, heat transfer, solar radiation
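A minimal sketch of a fully explicit one-dimensional finite-difference temperature update of the kind described above is shown below; the diffusivity, grid, time step and surface forcing are placeholder values, and the real model's surface energy balance (solar radiation, convection, emitted radiation) is reduced here to a prescribed surface temperature.

```python
# Simplified sketch of an explicit 1D FDM temperature update through a
# pavement depth profile; all values are illustrative assumptions.
import numpy as np

nz, dz, dt = 50, 0.01, 10.0          # 50 nodes, 1 cm spacing, 10 s time step
alpha = 1.0e-6                        # thermal diffusivity (m^2/s), assumed
T = np.full(nz, 20.0)                 # initial temperature profile (deg C)

def step(T, surface_T):
    Tn = T.copy()
    # interior nodes: T_i^{n+1} = T_i + alpha*dt/dz^2 * (T_{i+1} - 2*T_i + T_{i-1})
    Tn[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    Tn[0] = surface_T                 # prescribed surface temperature (simplified boundary)
    Tn[-1] = Tn[-2]                   # zero-flux condition at the bottom of the profile
    return Tn

for hour_temp in [25.0, 30.0, 35.0]:  # crude hourly surface forcing
    for _ in range(360):              # one hour of 10 s steps
        T = step(T, hour_temp)
```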
Procedia PDF Downloads 267
685 Mechanical Properties of Young and Senescence Fibroblast Cells Using Passive Microrheology
Authors: Samira Khalaji, Fenneke Klein Jan, Kay-E. Gottschalk, Eugenia Makrantonaki, Karin Scharffetter-Kochanek
Abstract:
Biological aging is a multi-dimensional process that takes place over a whole range of scales from the nanoscopic alterations within individual cells, over transformations in tissues and organs and to changes of the whole organism. On the single cell level, aging involves mutation of genes, differences in gene expression levels as well as altered posttranslational modifications of proteins. A variety of proteins is affected, including proteins of the cell cytoskeleton and migration machinery. Previous work quantified the expression of cytoskeleton proteins on the gene and protein levels in senescent and young fibroblasts. Their results show that senescent skin fibroblasts have an upregulated expression of the intermediate filament (IF) protein vimentin in contrast to actin and tubulin, which are downregulated. IFs play an important role in providing mechanical stability of cells. However, the mechanical properties of IFs depending on cellular senescence or age of the donor has not been studied so far. Hence, we employed passive microrheology on primary human dermal fibroblasts from female donors with age of 28 years (young) and 86 years (old) as model of in vivo aging and human normal dermal fibroblast from 11-year old male with CPD 17-35 (young) and CPD 58-59 (senescence) as a model of in vitro replicative senescence. In contrast to the expectations, our primary results show no significant differences in the viscoelastic properties of fibroblasts depending on age of the donor or cellular replicative senescence.Keywords: aging, cytoskeleton, fibroblast, mechanical properties
Procedia PDF Downloads 318
684 Prediction of Solanum Lycopersicum Genome Encoded microRNAs Targeting Tomato Spotted Wilt Virus
Authors: Muhammad Shahzad Iqbal, Zobia Sarwar, Salah-ud-Din
Abstract:
Tomato spotted wilt virus (TSWV) belongs to the genus Tospovirus (family Bunyaviridae). It is one of the most devastating pathogens of tomato (Solanum lycopersicum) and heavily damages crop yield each year around the globe. In this study, we retrieved 329 mature miRNA sequences from two microRNA databases (miRBase and miRSoldb) and checked for putative target sites in the downloaded genome sequence of TSWV. A consensus of three miRNA target prediction tools (RNA22, miRanda and psRNATarget) was used to screen out false-positive microRNA target sites in the TSWV genome. These tools identify target sites by calculating minimum free energy (mfe), site complementarity, minimum folding energy and other microRNA-mRNA binding factors. The R language was used to plot the predicted target-site data. All the genes having possible target sites for different miRNAs were screened by building a consensus table. Out of the 329 mature miRNAs evaluated by the three algorithms, only eight miRNAs met all the criteria/threshold specifications. MC-Fold and MC-Sym were used to predict the three-dimensional structures of these miRNAs, which were further analyzed in UCSF Chimera to visualize the structural and conformational changes before and after the microRNA-mRNA interactions. The results of the current study show that the predicted eight miRNAs could be further evaluated by in vitro experiments to develop TSWV-resistant transgenic tomato plants in the future.
Keywords: tomato spotted wilt virus (TSWV), Solanum lycopersicum, plant virus, miRNAs, microRNA target prediction, mRNA
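The consensus step described above amounts to intersecting the target predictions of the three tools; the sketch below illustrates it with hypothetical miRNA identifiers.

```python
# Sketch of the consensus filter: keep only miRNAs predicted to target the
# TSWV genome by all three tools; the miRNA names below are hypothetical.
rna22       = {"sly-miR159", "sly-miR166", "sly-miR1919", "sly-miR482"}
miranda     = {"sly-miR159", "sly-miR166", "sly-miR482", "sly-miR6024"}
psrnatarget = {"sly-miR159", "sly-miR482", "sly-miR403"}

consensus = rna22 & miranda & psrnatarget
print(consensus)   # -> {'sly-miR159', 'sly-miR482'}
```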
Procedia PDF Downloads 154
683 Quantum Mechanics as A Limiting Case of Relativistic Mechanics
Authors: Ahmad Almajid
Abstract:
The idea of unifying quantum mechanics with general relativity is still a dream for many researchers, as physics has only two paths, no more. Einstein's path, which is mainly based on particle mechanics, and the path of Paul Dirac and others, which is based on wave mechanics, the incompatibility of the two approaches is due to the radical difference in the initial assumptions and the mathematical nature of each approach. Logical thinking in modern physics leads us to two problems: - In quantum mechanics, despite its success, the problem of measurement and the problem of wave function interpretation is still obscure. - In special relativity, despite the success of the equivalence of rest-mass and energy, but at the speed of light, the fact that the energy becomes infinite is contrary to logic because the speed of light is not infinite, and the mass of the particle is not infinite too. These contradictions arise from the overlap of relativistic and quantum mechanics in the neighborhood of the speed of light, and in order to solve these problems, one must understand well how to move from relativistic mechanics to quantum mechanics, or rather, to unify them in a way different from Dirac's method, in order to go along with God or Nature, since, as Einstein said, "God doesn't play dice." From De Broglie's hypothesis about wave-particle duality, Léon Brillouin's definition of the new proper time was deduced, and thus the quantum Lorentz factor was obtained. Finally, using the Euler-Lagrange equation, we come up with new equations in quantum mechanics. In this paper, the two problems in modern physics mentioned above are solved; it can be said that this new approach to quantum mechanics will enable us to unify it with general relativity quite simply. If the experiments prove the validity of the results of this research, we will be able in the future to transport the matter at speed close to the speed of light. Finally, this research yielded three important results: 1- Lorentz quantum factor. 2- Planck energy is a limited case of Einstein energy. 3- Real quantum mechanics, in which new equations for quantum mechanics match and exceed Dirac's equations, these equations have been reached in a completely different way from Dirac's method. These equations show that quantum mechanics is a limited case of relativistic mechanics. At the Solvay Conference in 1927, the debate about quantum mechanics between Bohr, Einstein, and others reached its climax, while Bohr suggested that if particles are not observed, they are in a probabilistic state, then Einstein said his famous claim ("God does not play dice"). Thus, Einstein was right, especially when he didn't accept the principle of indeterminacy in quantum theory, although experiments support quantum mechanics. However, the results of our research indicate that God really does not play dice; when the electron disappears, it turns into amicable particles or an elastic medium, according to the above obvious equations. Likewise, Bohr was right also, when he indicated that there must be a science like quantum mechanics to monitor and study the motion of subatomic particles, but the picture in front of him was blurry and not clear, so he resorted to the probabilistic interpretation.Keywords: lorentz quantum factor, new, planck’s energy as a limiting case of einstein’s energy, real quantum mechanics, new equations for quantum mechanics
Procedia PDF Downloads 74
682 Time-Domain Analysis Approaches of Soil-Structure Interaction: A Comparative Study
Authors: Abdelrahman Taha, Niloofar Malekghaini, Hamed Ebrahimian, Ramin Motamed
Abstract:
This paper compares the substructure and direct methods for soil-structure interaction (SSI) analysis in the time domain. In the substructure SSI method, the soil domain is replaced by a set of springs and dashpots, also referred to as the impedance function, derived through the study of the behavior of a massless rigid foundation. The impedance function is inherently frequency dependent, i.e., it varies as a function of the frequency content of the structural response. To use the frequency-dependent impedance function for time-domain SSI analysis, the impedance function is approximated at the fundamental frequency of the structure-soil system. To explore the potential limitations of the substructure modeling process, a two-dimensional reinforced concrete frame structure is modeled using substructure and direct methods in this study. The results show discrepancies between the simulated responses of the substructure and the direct approaches. To isolate the effects of higher modal responses, the same study is repeated using a harmonic input motion, in which a similar discrepancy is still observed between the substructure and direct approaches. It is concluded that the main source of discrepancy between the substructure and direct SSI approaches is likely attributed to the way the impedance functions are calculated, i.e., assuming a massless rigid foundation without considering the presence of the superstructure. Hence, a refined impedance function, considering the presence of the superstructure, shall be developed. This refined impedance function is expected to significantly improve the simulation accuracy of the substructure approach for structural systems whose behavior is dominated by the fundamental mode response.Keywords: direct approach, impedance function, soil-structure interaction, substructure approach
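A minimal sketch of the substructure step described above, in which a frequency-dependent impedance function is reduced to constant spring and dashpot values at the fundamental frequency of the structure-soil system, is given below; the tabulated impedance values and the fundamental frequency are illustrative, not results from the study.

```python
# Sketch of evaluating a frequency-dependent impedance function at the
# fundamental frequency to obtain equivalent spring/dashpot constants.
import numpy as np

freqs  = np.array([0.5, 1.0, 2.0, 4.0, 8.0])                 # Hz (assumed)
k_real = np.array([5.0e8, 4.8e8, 4.2e8, 3.5e8, 2.6e8])       # stiffness part, N/m (assumed)
c_coef = np.array([2.0e7, 1.9e7, 1.7e7, 1.4e7, 1.0e7])       # damping part, N*s/m (assumed)

f0 = 1.6                                 # assumed fundamental frequency (Hz)
k_eq = np.interp(f0, freqs, k_real)      # equivalent spring constant at f0
c_eq = np.interp(f0, freqs, c_coef)      # equivalent dashpot coefficient at f0
print(k_eq, c_eq)
```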
Procedia PDF Downloads 111
681 Temperature Contour Detection of Salt Ice Using Color Thermal Image Segmentation Method
Authors: Azam Fazelpour, Saeed Reza Dehghani, Vlastimil Masek, Yuri S. Muzychka
Abstract:
The study uses a novel image analysis based on thermal imaging to detect temperature contours created on a salt ice surface during transient phenomena. Thermal cameras detect objects by using their emissivities and IR radiance. The ice surface temperature is not uniform during transient processes; the temperature starts to increase from the boundary of the ice towards its center. Thermal cameras are able to report temperature changes on the ice surface at every individual moment. Various contours, which show different temperature areas, appear in the ice surface picture captured by a thermal camera. Identifying the exact boundary of these contours is valuable to facilitate ice surface temperature analysis. Image processing techniques are used to extract each contour area precisely. In this study, several pictures are recorded while the temperature is increasing throughout the ice surface. Some pictures, taken at specific time intervals, are selected for processing. An image segmentation method is applied to the images to determine the contour areas. Color thermal images are used to exploit the main information. The red, green and blue elements of the color images are investigated to find the best contour boundaries. Image enhancement and noise removal algorithms are applied to obtain clear, high-contrast images. A novel edge detection algorithm based on differences in the color of the pixels is established to determine contour boundaries. In this method, the edges of the contours are obtained according to the properties of the red, blue and green image elements. The color image elements are assessed considering the information they carry. Useful elements proceed to processing and useless elements are removed to reduce the computation time. Neighboring pixels with close intensities are assigned to one contour, and differences in intensities determine boundaries. The results are then verified by conducting experimental tests. An experimental setup is prepared using ice samples and a thermal camera. To observe the created ice contours with the thermal camera, the samples, which are initially at -20 °C, are brought into contact with a warmer surface. Pictures are captured for 20 seconds. The method is applied to five images, which are captured at time intervals of 5 seconds. The study shows that the green image element carries no useful information; therefore, the boundary detection method is applied to the red and blue image elements. In this case study, the results indicate that the proposed algorithm detects the boundaries more effectively than other edge detection methods such as Sobel and Canny. A comparison between the contour detection in this method and the temperature analysis, which gives the real boundaries, shows good agreement. This color image edge detection method is applicable to other similar cases according to their image properties.
Keywords: color image processing, edge detection, ice contour boundary, salt ice, thermal image
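A minimal sketch of the idea of a color-difference edge detector of the kind described above, using only the red and blue channels, is shown below; the threshold and array handling are assumptions, not the authors' algorithm.

```python
# Sketch of marking contour boundaries where neighbouring pixels differ
# strongly in the red and blue channels (the green channel is discarded,
# as in the study); the threshold value is a placeholder.
import numpy as np

def color_difference_edges(img_rgb, threshold=15.0):
    """img_rgb: H x W x 3 array; returns a boolean edge mask."""
    rb = img_rgb[:, :, [0, 2]].astype(float)          # keep red and blue only
    dx = np.abs(np.diff(rb, axis=1)).max(axis=2)      # horizontal channel differences
    dy = np.abs(np.diff(rb, axis=0)).max(axis=2)      # vertical channel differences
    edges = np.zeros(img_rgb.shape[:2], dtype=bool)
    edges[:, 1:] |= dx > threshold
    edges[1:, :] |= dy > threshold
    return edges
```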
Procedia PDF Downloads 312
680 A Case Study on the Seismic Performance Assessment of the High-Rise Setback Tower Under Multiple Support Excitations on the Basis of TBI Guidelines
Authors: Kamyar Kildashti, Rasoul Mirghaderi
Abstract:
This paper describes the three-dimensional seismic performance assessment of a high-rise steel moment-frame setback tower, designed and detailed per the 2010 ASCE7, under multiple support excitations. The vulnerability analyses are conducted based on nonlinear history analyses under a set of multi-directional strong ground motion records which are scaled to design-based site-specific spectrum in accordance with ASCE41-13. Spatial variation of input motions between far distant supports of each part of the tower is considered by defining time lag. Plastic hinge monotonic and cyclic behavior for prequalified steel connections, panel zones, as well as steel columns is obtained from predefined values presented in TBI Guidelines, PEER/ATC72 and FEMA P440A to include stiffness and strength degradation. Inter-story drift ratios, residual drift ratios, as well as plastic hinge rotation demands under multiple support excitations, are compared to those obtained from uniform support excitations. Performance objectives based on acceptance criteria declared by TBI Guidelines are compared between uniform and multiple support excitations. The results demonstrate that input motion discrepancy results in detrimental effects on the local and global response of the tower.Keywords: high-rise building, nonlinear time history analysis, multiple support excitation, performance-based design
Procedia PDF Downloads 284
679 Use of Two-Dimensional Hydraulics Modeling for Design of Erosion Remedy
Authors: Ayoub. El Bourtali, Abdessamed.Najine, Amrou Moussa. Benmoussa
Abstract:
One of the main goals of river engineering is river training, which is defined as controlling and predicting the behavior of a river and taking effective measures to eliminate the related risks and thus improve the river system. In some rivers, the riverbed continues to erode and degrade; therefore, equilibrium will never be reached. Generally, river geometric characteristics and riverbed erosion analysis are among the most complex but critical topics in river engineering and sediment hydraulics; riverbank erosion is a further governing process in hydrodynamics, with a major impact on the ecological chain and on socio-economic processes. This study aims to integrate new computer technology that can analyze erosion and hydraulic problems through computer simulation and modeling. Choosing the right model remains a difficult and sensitive job for field engineers. This paper makes use of version 5.0.4 of the HEC-RAS model. The river section is adopted according to the gauged station and the proximity of the adjustment. In this work, we demonstrate how 2D hydraulic modeling helped clarify the design and provided visualizations of depths and velocities at the riverbanks and around advanced structures. The Hydrologic Engineering Center's River Analysis System (HEC-RAS) 2D model was used to create a hydraulic study of the erosion model. The geometric data were generated from a 12.5-meter x 12.5-meter resolution digital elevation model. In addition to showing eroded or overturned river sections, the model output also shows patterns of riverbank change, which can help reduce the problems caused by erosion.
Keywords: 2D hydraulics model, erosion, floodplain, hydrodynamic, HEC-RAS, riverbed erosion, river morphology, resolution digital data, sediment
Procedia PDF Downloads 188
678 Virtual Reality as a Method in Transformative Learning: A Strategy to Reduce Implicit Bias
Authors: Cory A. Logston
Abstract:
It is imperative researchers continue to explore every transformative strategy to increase empathy and awareness of racial bias. Racism is a social and political concept that uses stereotypical ideology to highlight racial inequities. Everyone has biases they may not be aware of toward disparate out-groups. There is some form of racism in every profession; doctors, lawyers, and teachers are not immune. There have been numerous successful and unsuccessful strategies to motivate and transform an individual’s unconscious biased attitudes. One method designed to induce a transformative experience and identify implicit bias is virtual reality (VR). VR is a technology designed to transport the user to a three-dimensional environment. In a virtual reality simulation, the viewer is immersed in a realistic interactive video taking on the perspective of a Black man. The viewer as the character experiences discrimination in various life circumstances growing up as a child into adulthood. For instance, the prejudice felt in school, as an adolescent encountering the police and false accusations in the workplace. Current research suggests that an immersive VR simulation can enhance self-awareness and become a transformative learning experience. This study uses virtual reality immersion and transformative learning theory to create empathy and identify any unintentional racial bias. Participants, White teachers, will experience a VR immersion to create awareness and identify implicit biases regarding Black students. The desired outcome provides a springboard to reconceptualize their own implicit bias. Virtual reality is gaining traction in the research world and promises to be an effective tool in the transformative learning process.Keywords: empathy, implicit bias, transformative learning, virtual reality
Procedia PDF Downloads 193
677 Teachers’ Instructional Decisions When Teaching Geometric Transformations
Authors: Lisa Kasmer
Abstract:
Teachers’ instructional decisions shape the structure and content of mathematics lessons and influence the mathematics that students are given the opportunity to learn. Therefore, it is important to better understand how teachers make instructional decisions and thus find new ways to help practicing and future teachers give their students a more effective and robust learning experience. Understanding the relationship between teachers’ instructional decisions and their goals, resources, and orientations (beliefs) is important given the heightened focus on geometric transformations in the middle school mathematics curriculum. This work is significant as the development and support of current and future teachers need more effective ways to teach geometry to their students. The following research questions frame this study: (1) As middle school mathematics teachers plan and enact instruction related to teaching transformations, what thinking processes do they engage in to make decisions about teaching transformations with or without a coordinate system and (2) How do the goals, resources and orientations of these teachers impact their instructional decisions and reveal about their understanding of teaching transformations? Teachers and students alike struggle with understanding transformations; many teachers skip or hurriedly teach transformations at the end of the school year. However, transformations are an important mathematical topic as this topic supports students’ understanding of geometric and spatial reasoning. Geometric transformations are a foundational concept in mathematics, not only for understanding congruence and similarity but for proofs, algebraic functions, and calculus etc. Geometric transformations also underpin the secondary mathematics curriculum, as features of transformations transfer to other areas of mathematics. Teachers’ instructional decisions in terms of goals, orientations, and resources that support these instructional decisions were analyzed using open-coding. Open-coding is recognized as an initial first step in qualitative analysis, where comparisons are made, and preliminary categories are considered. Initial codes and categories from current research on teachers’ thinking processes that are related to the decisions they make while planning and reflecting on the lessons were also noted. Surfacing ideas and additional themes common across teachers while seeking patterns, were compared and analyzed. Finally, attributes of teachers’ goals, orientations and resources were identified in order to begin to build a picture of the reasoning behind their instructional decisions. These categories became the basis for the organization and conceptualization of the data. Preliminary results suggest that teachers often rely on their own orientations about teaching geometric transformations. These beliefs are underpinned by the teachers’ own mathematical knowledge related to teaching transformations. When a teacher does not have a robust understanding of transformations, they are limited by this lack of knowledge. These shortcomings impact students’ opportunities to learn, and thus disadvantage their own understanding of transformations. Teachers’ goals are also limited by their paucity of knowledge regarding transformations, as these goals do not fully represent the range of comprehension a teacher needs to teach this topic well.Keywords: coordinate plane, geometric transformations, instructional decisions, middle school mathematics
Procedia PDF Downloads 87
676 Construction of Submerged Aquatic Vegetation Index through Global Sensitivity Analysis of Radiative Transfer Model
Authors: Guanhua Zhou, Zhongqi Ma
Abstract:
Submerged aquatic vegetation (SAV) in wetlands can absorb nitrogen and phosphorus effectively and thus prevent the eutrophication of water. It is feasible to monitor the distribution of SAV through remote sensing, but because the vegetation signal is weakened by the overlying water body, traditional terrestrial vegetation indices are not applicable. This paper aims at constructing an SAV index that enhances the vegetation signal and distinguishes SAV from the water body. The methodology is as follows: (1) select the bands sensitive to the vegetation parameters based on a global sensitivity analysis of an SAV canopy radiative transfer model; (2) taking the soil line concept as a reference, analyze the distribution of SAV and water reflectance simulated by the SAV canopy model and a semi-analytical water model in the two-dimensional spaces built from different sensitive bands; (3) select the band combinations with better separation between SAV and water, and use them to build SAV indices in the form of the normalized difference vegetation index (NDVI); (4) analyze the sensitivity of the indices to the water and vegetation parameters, and choose the one more sensitive to the vegetation parameters. The index formed from the bands with central wavelengths at 705 nm and 842 nm proves to have high sensitivity to the chlorophyll content of leaves while being little affected by water constituents. The model simulation shows a weak, generally negative correlation of the SAV index with increasing water depth. Moreover, the index enhances the capability of separating SAV from water compared to NDVI. The SAV index is expected to have potential in parameter inversion for wetland remote sensing.
Keywords: global sensitivity analysis, radiative transfer model, submerged aquatic vegetation, vegetation indices
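Step (3) above builds the index in NDVI form from the two selected bands; a minimal sketch, assuming band centres at 705 nm and 842 nm and illustrative reflectance values, is shown below.

```python
# Sketch of the normalized-difference form of the proposed SAV index built
# from the two sensitive bands; reflectance values are illustrative only.
def sav_index(r705, r842):
    return (r842 - r705) / (r842 + r705)

print(sav_index(r705=0.04, r842=0.12))   # vegetated pixel, higher value (0.5)
print(sav_index(r705=0.05, r842=0.03))   # open-water pixel, negative value (-0.25)
```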
Procedia PDF Downloads 261
675 Poland and the Dawn of the Right to Education and Development: Moving Back in Time
Authors: Magdalena Zabrocka
Abstract:
The terror of women throughout the governance of the current populist ruling party in Poland, PiS, has been a subject of a heated debate alongside the issues of minorities’ rights, the rule of law, and democracy in the country. The challenges that women and other vulnerable groups are currently facing, however, come down to more than just a lack of comprehensive equality laws, severely limited reproductive rights, hateful slogans, and messages propagated by the central authority and its sympathisers, or a common disregard for women’s fundamental rights. Many sources and media reports are available only in Polish, while international rapporteurs fail to acknowledge the whole picture of the tragedy happening in the country and the variety of factors affecting it. Starting with the authorities’ and Polish catholic church’s propaganda concerning CEDAW and the Istanbul Convention Action against Violence against Women and Domestic Violence by spreading strategic disinformation that it codifies ‘gender ideology’ and ‘anti-Christian values’ in order to convince the electorate that the legal instruments should be ‘abandoned’. Alongside severely restricted abortion rights, bullying medical professionals helping women exercise their reproductive rights, violating women’s privacy by introducing a mandatory registry of pregnancies (so that one’s pregnancy or its ‘loss’ can be tracked and traced), restricting access to the ‘day after pill’ and real sex education at schools (most schools have a subject of ‘knowledge of living in a family’), introducing prison punishment for teachers accused of spreading ‘sex education’, and many other, the current tyrant government, has now decided to target the youngest with its misinformation and indoctrination, via strategically designed textbooks and curriculum. Biology books have seen a big restriction on the size of the chapters devoted to evolution, reproductive system, and sexual health. Approved religion books (which are taught 2-3 times a week as compared to 1 a week sciences) now cover false information about Darwin’s theory and arguments ‘against it’. Most recently, however, the public spoke up against the absurd messages contained in the politically rewritten history books, where the material about some figures not liked by the governing party has already been manipulated. In the recently approved changes to the history textbook, one can find a variety of strongly biased and politically-charged views representative of the conservatives in the states, most notably, equating the ‘gender ideology’ and feminism with Nazism. Thus, this work, by employing a human rights approach, would focus on the right to education and development as well as the considerate obstacles to access to scientific information by the youth.Keywords: Poland, right to education, right to development, authoritarianism, access to information
Procedia PDF Downloads 104
674 Evaluation of Heat Transfer and Entropy Generation by Al2O3-Water Nanofluid
Authors: Houda Jalali, Hassan Abbassi
Abstract:
In this numerical work, natural convection and entropy generation of an Al2O3–water nanofluid in a square cavity have been studied. A two-dimensional, steady, laminar natural convection flow in a differentially heated square cavity of length L, filled with a nanofluid, is investigated numerically. The horizontal walls are considered adiabatic. The vertical walls at x=0 and x=L are maintained at the hot temperature Th and the cold temperature Tc, respectively. The resolution is performed by the CFD code "FLUENT" in combination with GAMBIT as mesh generator. The simulations are performed for Rayleigh numbers in the range 10³ ≤ Ra ≤ 10⁶, solid volume fractions from 1% to 5%, a fixed particle size of dp=33 nm, and temperatures ranging from 20 to 70 °C. We used thermophysical nanofluid property models based on experimental measurements, such as models of thermal conductivity and dynamic viscosity that depend on solid volume fraction, particle size and temperature, to study the effect of adding solid particles to water on natural convection heat transfer and entropy generation. The average Nusselt number is calculated at the hot wall of the cavity for different solid volume fractions. The most important result is that at low temperatures (less than 40 °C), the addition of Al2O3 nanosolids to water leads to a decrease in heat transfer and entropy generation instead of the expected increase, whereas at high temperature, heat transfer and entropy generation increase with the addition of nanosolids. This behavior is due to the contradictory effects of the viscosity and thermal conductivity of the nanofluid. These effects are discussed in this work.
Keywords: entropy generation, heat transfer, nanofluid, natural convection
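As an indication of how nanofluid property models enter the problem above, the sketch below uses the classical Maxwell mixing rule and the Rayleigh number definition; the paper itself uses temperature-dependent correlations fitted to experimental measurements, so the Maxwell form here is only a placeholder.

```python
# Sketch of how a conductivity model feeds into the Rayleigh number; the
# Maxwell rule stands in for the paper's experimental correlations.
def maxwell_conductivity(k_f, k_p, phi):
    """Effective conductivity of a dilute suspension (classical Maxwell model)."""
    return k_f * (k_p + 2 * k_f + 2 * phi * (k_p - k_f)) / \
                 (k_p + 2 * k_f - phi * (k_p - k_f))

def rayleigh(g, beta, dT, L, nu, alpha):
    return g * beta * dT * L**3 / (nu * alpha)

k_eff = maxwell_conductivity(k_f=0.613, k_p=40.0, phi=0.03)       # W/m.K at 3 vol%
Ra = rayleigh(9.81, 2.1e-4, 10.0, 0.02, 1.0e-6, 1.4e-7)           # ~1e6 with water-like values
print(k_eff, Ra)
```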
Procedia PDF Downloads 276
673 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method
Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga
Abstract:
Machining or metal cutting is one of the most widely used production processes in industry. The quality of the process and the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamic (SPH) methodology, was performed on the orthogonal metal cutting process to analyze three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong models (Z-A). The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative to simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16% when compared with empirical values. The simulation values for flow stress versus strain at various temperatures were also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle and tool tip radius on the simulation.Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses
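Of the three constitutive models compared above, the Johnson-Cook relation has the simplest closed form; the sketch below evaluates it with typical literature constants for AISI 1045, which are not necessarily the values used by the authors.

```python
# Sketch of the standard Johnson-Cook flow-stress relation; the AISI 1045
# constants are typical literature values, given here only for illustration.
import math

def johnson_cook(strain, strain_rate, T, A=553.1, B=600.8, n=0.234,
                 C=0.0134, m=1.0, eps0=1.0, T_room=293.0, T_melt=1733.0):
    T_star = (T - T_room) / (T_melt - T_room)       # homologous temperature
    return (A + B * strain**n) * (1.0 + C * math.log(strain_rate / eps0)) \
           * (1.0 - T_star**m)                      # flow stress in MPa

print(johnson_cook(strain=0.5, strain_rate=1000.0, T=500.0))
```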
Procedia PDF Downloads 259
672 Polymer Nanostructures Based Catalytic Materials for Energy and Environmental Applications
Authors: S. Ghosh, L. Ramos, A. N. Kouamé, A.-L. Teillout, H. Remita
Abstract:
Catalytic materials have attracted continuous attention due to their promising applications in a variety of energy and environmental areas, including clean energy, energy conversion and storage, purification and separation, degradation of pollutants, and electrochemical reactions. With advanced synthetic technologies, polymer nanostructures and nanocomposites can be synthesized directly through a soft-template-mediated approach using swollen hexagonal mesophases, which makes it possible to modulate the size, morphology, and structure of the polymer nanostructures. As an alternative to conventional catalytic materials, one-dimensional PDPB polymer nanostructures show high photocatalytic activity under visible light for the degradation of pollutants. These photocatalysts are very stable with cycling. Transmission electron microscopy (TEM) and AFM-IR characterizations reveal that the morphology and structure of the polymer nanostructures do not change after photocatalysis. These stable and cheap polymer nanofibers and metal-polymer nanocomposites are easy to process and can be reused without appreciable loss of activity. The polymer nanocomposites are formed via a one-pot chemical redox reaction yielding 3.4 nm Pd nanoparticles on poly(diphenylbutadiyne) (PDPB) nanofibers (30 nm). The reduction of Pd(II) ions is accompanied by oxidative polymerization, leading to composite materials. The hybrid Pd/PDPB nanocomposites are used as electrode materials for the electrocatalytic oxidation of ethanol without the support of a proton exchange Nafion membrane. Hence, these conducting polymer nanofibers and nanocomposites offer the perspective of developing a new generation of efficient photocatalysts for environmental protection and of electrocatalysts for fuel cell applications.
Keywords: conducting polymer, swollen hexagonal mesophases, solar photocatalysis, electrocatalysis, water depollution
Procedia PDF Downloads 382
671 Migrant Labour in Kerala: A Study on Inter-State Migrant Workers
Authors: Arun Perumbilavil Anand
Abstract:
In recent years, Kerala has been witnessing a large inflow of migrants from different parts of the country. Initially, the migrants were largely from the districts of Tamil Nadu and mostly seasonal in nature, but at a later period the state started receiving migrants from far-off states such as UP, Assam and Bengal. Higher wages for unskilled labour, large opportunities for employment, the reluctance of Kerala workers to do menial and hard physical work, and the shortage of local labour - paradoxical given the high unemployment rate in the state - led to the massive influx of migrant labourers. This study takes a multi-dimensional overview of migrant labour in Kerala by encompassing factors such as channels of migration, the nature of the employment contracts entered into, and the corresponding wages and benefits obtained. The study also analyses the circumstances that led to the large influx of migrants from different states of India. It further examines the varying dimensions of the migrants' living and working environment, as well as their health conditions. The study is based on empirical findings obtained from primary interviews conducted with migrants in the districts of Palakkad, Malappuram, and Ernakulam. The study concludes by noting that Kerala will inevitably have to depend on migrant labour and is likely to experience heavy in-migration of labour in the future, provided the existing socioeconomic and demographic conditions persist. Since this is inevitable, the best course for the state is to prepare well in advance to receive and accommodate such migrant labourers and enable them to lead a comfortable life in a hassle-free environment, so that they can play a vital role in further strengthening and sustaining the growth trajectory not only of Kerala's economy but also of the states of origin.
Keywords: Kerala, labour, migration, migrant workers
Procedia PDF Downloads 252
670 Monitoring and Evaluation of Web-Services Quality and Medium-Term Impact on E-Government Agencies' Efficiency
Authors: A. F. Huseynov, N. T. Mardanov, J. Y. Nakhchivanski
Abstract:
This practical research aims to improve the management quality and efficiency of public administration agencies providing e-services. The monitoring system developed will provide a continuous review of the websites' compliance with the selected indicators, their evaluation based on those indicators, and a ranking of services according to the quality criteria. The responsible departments in the government agencies were surveyed; the questionnaire covers issues of management and feedback, the e-services provided, and the application of information systems. By analyzing the main influencing factors and barriers, recommendations will be given that lead to the relevant decisions to strengthen the state agencies' competencies for the management and provision of their services. Component 1: E-services monitoring system. Three separate monitoring activities are proposed to be executed in parallel. First, continuous tracing of e-government sites using a built-in web-monitoring program; this program generates several quantitative values which are mainly related to the technical characteristics and performance of the websites. Second, the expert assessment of e-government sites in accordance with two general criteria. Criterion 1: technical quality of the site. Criterion 2: usability/accessibility (load, see, use). Each high-level criterion is in turn subdivided into several sub-criteria, such as the fonts and the color of the background (is it readable?), W3C coding standards, availability of robots.txt and the site map, the search engine, the feedback/contact mechanisms, and the security mechanisms. Third, an online survey of the users/citizens, consisting of a small group of questions embedded in the e-service websites. The questionnaires comprise information concerning navigation, the users' experience with the website (whether it was positive or negative), etc. Automated monitoring of websites on its own cannot capture the whole evaluation process and should therefore be seen as a complement to experts' manual web evaluations. All of the separate results were integrated to provide the complete evaluation picture. Component 2: Assessment of the agencies'/departments' efficiency in providing e-government services. The relevant indicators to evaluate the efficiency and effectiveness of e-services were identified; the survey was conducted in all the governmental organizations (ministries, committees and agencies) that provide electronic services for citizens or businesses; the quantitative and qualitative measures cover the following areas of activity: e-governance, e-services, feedback from the users, and the information systems at the agencies' disposal. Main results: 1. The software program and the set of indicators for website evaluation have been developed, and the results of pilot monitoring have been presented. 2. The evaluation of the (internal) efficiency of the e-government agencies based on the survey results, with practical recommendations related to human potential, the information systems used, and the e-services provided.
Keywords: e-government, web-sites monitoring, survey, internal efficiency
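A minimal sketch of the automated checks listed above (robots.txt, site map, load time, security) is given below; the URL and the exact set of checks are illustrative assumptions.

```python
# Sketch of basic automated website checks of the kind the web-monitoring
# program performs; the base URL below is a placeholder.
import time
import requests

def basic_site_checks(base_url):
    checks = {}
    for path in ("robots.txt", "sitemap.xml"):
        r = requests.get(f"{base_url.rstrip('/')}/{path}", timeout=10)
        checks[path] = (r.status_code == 200)       # resource is published
    start = time.time()
    requests.get(base_url, timeout=10)
    checks["load_time_s"] = round(time.time() - start, 2)
    checks["https"] = base_url.startswith("https://")
    return checks

print(basic_site_checks("https://www.example.gov"))
```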
Procedia PDF Downloads 304
669 Full-Face Hyaluronic Acid Implants Assisted by Artificial Intelligence-Generated Post-treatment 3D Models
Authors: Ciro Cursio, Pio Luigi Cursio, Giulia Cursio, Isabella Chiardi, Luigi Cursio
Abstract:
Introduction: Full-face aesthetic treatments often present a difficult task: since different patients possess different anatomical and tissue characteristics, there is no guarantee that the same treatment will have the same effect on multiple patients; additionally, full-face rejuvenation and beautification treatments require not only a high degree of technical skill but also the ability to choose the right product for each area and a keen artistic eye. Method: We present an artificial intelligence-based algorithm that can generate realistic post-treatment 3D models based on the patient’s requests together with the doctor’s input. These 3-dimensional predictions can be used by the practitioner for two purposes: firstly, they help ensure that the patient and the doctor are completely aligned on the expectations of the treatment; secondly, the doctor can use them as a visual guide, obtaining a natural result that would normally stem from the practitioner's artistic skills. To this end, the algorithm is able to predict injection zones, the type and quantity of hyaluronic acid, the injection depth, and the technique to use. Results: Our innovation consists in providing an objective visual representation of the patient that is helpful in the patient-doctor dialogue. The patient, based on this information, can express her desire to undergo a specific treatment or make changes to the therapeutic plan. In short, the patient becomes an active agent in the choices made before the treatment. Conclusion: We believe that this algorithm will reveal itself as a useful tool in the pre-treatment decision-making process to prevent both the patient and the doctor from making a leap into the dark.Keywords: hyaluronic acid, fillers, full face, artificial intelligence, 3D
Procedia PDF Downloads 88
668 Disaster Capitalism, Charter Schools, and the Reproduction of Inequality in Poor, Disabled Students: An Ethnographic Case Study
Authors: Sylvia Mac
Abstract:
This ethnographic case study examines disaster capitalism, neoliberal market-based school reforms, and disability through the lens of Disability Studies in Education. More specifically, it explores neoliberalism and special education at a small, urban charter school in a large city in California and the (re)production of social inequality. The study uses Sociology of Special Education to examine the ways in which special education is used to sort and stratify disabled students. At a time when rhetoric surrounding public schools is framed in catastrophic and dismal language in order to justify the privatization of public education, small urban charter schools must be examined to learn if they are living up to their promise or acting as another way to maintain economic and racial segregation. The study concludes that neoliberal contexts threaten successful inclusive education and normalize poor, disabled students’ continued low achievement and poor post-secondary outcomes. This ethnographic case study took place at a small urban charter school in a large city in California. Participants included three special education students, the special education teacher, the special education assistant, a regular education teacher, and the two founders and charter writers. The school claimed to have a push-in model of special education where all special education students were fully included in the general education classroom. Although presented as fully inclusive, some special education students also attended a pull-out class called Study Skills. The study found that inclusion and neoliberalism are differing ideologies that cannot co-exist. Successful inclusive environments cannot thrive while under the influences of neoliberal education policies such as efficiency and cost-cutting. Additionally, the push for students to join the global knowledge economy means that more and more low attainers are further marginalized and kept in poverty. At this school, neoliberal ideology eclipsed the promise of inclusive education for special education students. This case study has shown the need for inclusive education to be interrogated through lenses that consider macro factors, such as neoliberal ideology in public education, as well as the emerging global knowledge economy and increasing income inequality. Barriers to inclusion inside the school, such as teachers’ attitudes, teacher preparedness, and school infrastructure paint only part of the picture. Inclusive education is also threatened by neoliberal ideology that shifts the responsibility from the state to the individual. This ideology is dangerous because it reifies the stereotypes of disabled students as lazy, needs drains on already dwindling budgets. If these stereotypes persist, inclusive education will have a difficult time succeeding. In order to more fully examine the ways in which inclusive education can become truly emancipatory, we need more analysis on the relationship between neoliberalism, disability, and special education.Keywords: case study, disaster capitalism, inclusive education, neoliberalism
Procedia PDF Downloads 219
667 A Comparative Study of the Effects of Vibratory Stress Relief and Thermal Aging on the Residual Stress of Explosives Materials
Authors: Xuemei Yang, Xin Sun, Cheng Fu, Qiong Lan, Chao Han
Abstract:
Residual stresses, which can be produced during the manufacturing process of plastic bonded explosive (PBX), play an important role in weapon system security and reliability, and they can and do change in service. This paper mainly studies the influence of vibratory stress relief (VSR) and thermal aging on the residual stress of explosives. Firstly, the residual stress relaxation of PBX under different VSR conditions, such as vibration time, amplitude, and dynamic strain, was studied by the hole-drilling technique. The results indicated that the vibratory amplitude, time, and dynamic strain had a significant influence on the residual stress relief of PBX. The rate of residual stress relief of PBX first increases and then decreases with increasing dynamic strain, amplitude, and time, because at first the activation energy is too small to make the PBX yield plastically. Once the dynamic strain, time, and amplitude exceed a certain threshold, the residual stress follows the same trend and decreases sharply; this sharp drop in the residual stress relief rate may be caused by over-vibration. A comparison between VSR and thermal aging was also made. The conclusion is that the reduction ratio of residual stress after VSR with suitable vibratory parameters is equivalent to 73% of that obtained by 7 days of thermal aging. In addition, the density attenuation rate, mechanical properties, and dimensional stability measured 3 months after the VSR process were almost the same as those after thermal aging. However, compared with traditional thermal aging, VSR takes only a very short time, which greatly improves the efficiency of aging treatment for explosive materials. Therefore, VSR could be a potential alternative technique in the industrial residual stress relaxation of PBX explosives. Keywords: explosives, residual stresses, thermal aging, vibratory stress relief, VSR
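As a back-of-the-envelope illustration of the comparison reported above, the snippet below computes a residual stress relief ratio from before/after stress values and scales a hypothetical 7-day thermal aging result by the 73% equivalence quoted in the abstract. The stress magnitudes are invented placeholders, not the paper's measurements.

```python
# Illustrative comparison of stress-relief ratios for VSR versus thermal aging.
# The stress values are made-up placeholders; only the 73% equivalence comes
# from the abstract.

def relief_ratio(sigma_initial_mpa: float, sigma_after_mpa: float) -> float:
    """Fraction of the initial residual stress removed by a treatment."""
    return (sigma_initial_mpa - sigma_after_mpa) / sigma_initial_mpa

sigma_0 = 10.0                              # hypothetical initial residual stress, MPa
thermal_7d = relief_ratio(sigma_0, 6.0)     # hypothetical 7-day thermal aging result
vsr = 0.73 * thermal_7d                     # abstract: VSR reaches ~73% of that relief

print(f"7-day thermal aging relief: {thermal_7d:.0%}")
print(f"Equivalent VSR relief:      {vsr:.0%}")
```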
Procedia PDF Downloads 157
666 Minimizing Vehicular Traffic via Integrated Land Use Development: A Heuristic Optimization Approach
Authors: Babu Veeregowda, Rongfang Liu
Abstract:
The current traffic impact assessment methodology and environmental quality review process for approving land development projects are conventional, stagnant, and one-dimensional. The environmental review policy and procedure give no direction to regulate or seek alternative land uses and sizes that exploit the existing or surrounding elements of the built environment (the '4 Ds' of development: Density, Diversity, Design, and Distance to Transit) or smart growth principles, which influence travel behavior and have a significant effect in reducing vehicular traffic. Additionally, the environmental review policy gives no direction on how to incorporate urban planning into development, for example through non-motorized roadway elements such as sidewalks, bus shelters, and access to community facilities. This research developed a methodology to optimize the mix and sizes of land uses through a heuristic optimization process, minimizing auto-dependent development while meeting the interests of key stakeholders. A case study of the Willets Point Mixed Use Development in Queens, New York, was used to assess the benefits of the methodology. The approved Willets Point Mixed Use project was based on the maximum envelope of size and land use type allowed by current conventional urban renewal plans. This paper also evaluates parking accumulation for various land uses to explore the potential for shared parking to further optimize the mix and sizes of land uses. This research is timely and useful to many stakeholders interested in understanding the benefits of integrated land use development. Keywords: traffic impact, mixed use, optimization, trip generation
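A minimal sketch of the kind of heuristic search described above is given below: a random search over land-use floor areas that minimizes estimated peak-hour vehicle trips under a fixed development envelope. The trip rates, land-use categories, envelope size, and 10% minimum-share constraint are illustrative assumptions (a real formulation would use calibrated trip generation rates and stakeholder constraints), not the study's model.

```python
# Random-search sketch: pick a land-use mix (floor areas, in thousands of sq ft)
# that minimizes estimated peak-hour vehicle trips. Rates are hypothetical.
import random

TRIP_RATES = {          # vehicle trips per 1,000 sq ft (illustrative, not ITE values)
    "residential": 0.6,
    "retail": 3.8,
    "office": 1.5,
    "hotel": 0.8,
}
TOTAL_KSF = 5000.0      # total development envelope, thousands of sq ft

def trips(mix):
    # Estimated peak-hour vehicle trips for a given mix of floor areas.
    return sum(TRIP_RATES[use] * ksf for use, ksf in mix.items())

def random_mix():
    # Random allocation of the envelope, keeping every use at or above 10% of
    # the total as a stand-in for stakeholder/program constraints.
    floors = {use: 0.10 * TOTAL_KSF for use in TRIP_RATES}
    remaining = TOTAL_KSF - sum(floors.values())
    weights = {use: random.random() for use in TRIP_RATES}
    total_w = sum(weights.values())
    for use in TRIP_RATES:
        floors[use] += remaining * weights[use] / total_w
    return floors

best = min((random_mix() for _ in range(10_000)), key=trips)
print({use: round(ksf) for use, ksf in best.items()}, "->", round(trips(best)), "trips")
```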
Procedia PDF Downloads 211
665 Microbial Degradation of Lignin for Production of Valuable Chemicals
Authors: Fnu Asina, Ivana Brzonova, Keith Voeller, Yun Ji, Alena Kubatova, Evguenii Kozliak
Abstract:
Lignin, a heterogeneous three-dimensional biopolymer, is one of the building blocks of lignocellulosic biomass. Due to its limited chemical reactivity, lignin is currently processed as a low-value by-product in pulp and paper mills. Among various industrial lignins, Kraft lignin represents a major source of by-products generated during the widely employed pulping process across the pulp and paper industry. Therefore, valorization of Kraft lignin holds great potential, as it would provide a readily available source of aromatic compounds for various industrial applications. Microbial degradation is well known for using both highly specific ligninolytic enzymes secreted by microorganisms and mild operating conditions compared with conventional chemical approaches. In this study, the degradation of Indulin AT lignin was assessed by comparing the effects of basidiomycetous fungi (Coriolus versicolor and Trametes gallica) and Actinobacteria (Mycobacterium sp. and Streptomyces sp.) with those of two commercial laccases, T. versicolor (≥ 10 U/mg) and C. versicolor (≥ 0.3 U/mg). After 54 days of cultivation, the extent of microbial degradation was significantly higher than that achieved by the commercial laccases, reaching a maximum of 38 wt% for the C. versicolor-treated samples. Lignin degradation was further confirmed by thermal carbon analysis with a five-step temperature protocol. Compared with the commercial laccases, a significant decrease in char formation at 850 °C was observed for all microbially degraded lignins, with a corresponding increase in the carbon percentage evolved between 200 °C and 500 °C. To complement the carbon analysis results, chemical characterization of the degraded products at different stages of delignification by the microorganisms and commercial laccases was performed by pyrolysis-GC-MS. Keywords: lignin, microbial degradation, pyrolysis-GC-MS, thermal carbon analysis
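For clarity, the short sketch below computes the two quantities compared in the abstract: the gravimetric degradation extent (wt%) after cultivation and the share of evolved carbon attributable to char at the 850 °C step of a stepwise thermal carbon analysis. All masses and carbon fractions are invented placeholders; only the 38 wt% maximum is taken from the abstract.

```python
# Back-of-the-envelope calculations for degradation extent and char fraction.
# Input values are hypothetical examples, not the study's data.

def degradation_wt_percent(initial_mg: float, residual_mg: float) -> float:
    """Weight-percent of lignin degraded after cultivation."""
    return 100.0 * (initial_mg - residual_mg) / initial_mg

def char_fraction(carbon_evolved_mg_c: dict) -> float:
    """Share of total evolved carbon released at the 850 C step (char)."""
    total = sum(carbon_evolved_mg_c.values())
    return carbon_evolved_mg_c[850] / total

# ~38 wt% degradation, matching the abstract's reported maximum.
print(degradation_wt_percent(initial_mg=100.0, residual_mg=62.0))

# Hypothetical five-step carbon evolution profile (mg C per temperature step).
print(char_fraction({200: 0.10, 300: 0.15, 400: 0.20, 500: 0.25, 850: 0.30}))
```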
Procedia PDF Downloads 410
664 Numerical Analysis for Soil Compaction and Plastic Points Extension in Pile Drivability
Authors: Omid Tavasoli, Mahmoud Ghazavi
Abstract:
A numerical analysis of the drivability of piles of different geometries is presented. In this paper, a three-dimensional finite difference analysis of plastic point extension and soil compaction due to pile driving is carried out. Four pile configurations are investigated: a cylindrical pile, a fully tapered pile, a T-C pile consisting of a tapered upper segment and a cylindrical lower segment, and a C-T pile with a cylindrical upper part followed by a tapered part. All piles, which are driven to a total penetration depth of 16 m, have the same length, equivalent surface areas, and approximately identical material volumes. An idealization of the pile-soil system during pile driving is adopted for this approach. The pile is modeled as a linear elastic material in the vertical direction, while the soil obeys an elasto-plastic constitutive law whose failure is controlled by the Mohr-Coulomb criterion. Slip occurring at the pile-soil contact surfaces along the shaft and at the toe during driving is simulated with interface elements. All initial and boundary conditions are the same in all analyses. Quiet boundaries are used to prevent wave reflection in the lateral and vertical directions of the soil domain. The results obtained from the numerical analyses were compared with other available numerical data and laboratory tests, indicating satisfactory agreement. It is shown that with increasing taper angle, the permanent pile toe settlement increases and, therefore, the extension of plastic points increases. These are interesting phenomena in pile driving and are on the safe side for driven piles. Keywords: pile driving, finite difference method, non-uniform piles, pile geometry, pile set, plastic points, soil compaction
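As an illustration of how plastic points are flagged in such an analysis, the sketch below applies the Mohr-Coulomb criterion, tau_f = c + sigma_n * tan(phi), to a single stress state. The soil parameters and stresses are arbitrary example values, not the study's inputs.

```python
# Mohr-Coulomb yield check for a single point: the zone is flagged plastic when
# the shear stress reaches the failure envelope tau_f = c + sigma_n * tan(phi).
# Parameter values below are arbitrary examples.
import math

def is_plastic(sigma_n_kpa: float, tau_kpa: float,
               cohesion_kpa: float, phi_deg: float) -> bool:
    """True if the stress state reaches or violates the Mohr-Coulomb criterion."""
    tau_f = cohesion_kpa + sigma_n_kpa * math.tan(math.radians(phi_deg))
    return tau_kpa >= tau_f

# Example: a point near the pile toe with sigma_n = 120 kPa and tau = 85 kPa
# in a soil with c = 5 kPa and phi = 30 degrees.
print(is_plastic(120.0, 85.0, cohesion_kpa=5.0, phi_deg=30.0))  # True -> plastic point
```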
Procedia PDF Downloads 482