Search results for: place and space
2064 Ground Surface Temperature History Prediction Using Long Short-Term Memory Neural Network Architecture
Authors: Venkat S. Somayajula
Abstract:
Ground surface temperature history prediction models play a vital role in determining standards for international nuclear waste management. International standards for borehole-based nuclear waste disposal require paleoclimate cycle predictions on the scale of a million years forward for the site of waste disposal. This research focuses on developing a paleoclimate cycle prediction model using a Bayesian long short-term memory (LSTM) neural architecture operating on accumulated borehole temperature history data. Bayesian models have previously been used for paleoclimate cycle prediction based on the Monte Carlo weight method, but due to limitations pertaining to coupling with certain other prediction networks, Bayesian models in the past could not accommodate prediction cycles over 1,000 years. LSTM provides a frontier for coupling the developed models with other prediction networks with ease. The paleoclimate cycle model developed using this process will be trained on existing borehole data and then coupled to surface temperature history prediction networks, which give endpoints for backpropagation of the LSTM network and optimize the prediction cycle for larger prediction time scales. The trained LSTM will be tested on past data for validation and then propagated for forward prediction of temperatures at borehole locations. This research will be beneficial for studies pertaining to nuclear waste management, anthropological cycle predictions and geophysical features.
Keywords: Bayesian long short-term memory neural network, borehole temperature, ground surface temperature history, paleoclimate cycle
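Below is a minimal, illustrative sketch (not the authors' implementation) of the core idea of training an LSTM on a borehole temperature history for one-step-ahead prediction. It uses a plain rather than Bayesian LSTM, a synthetic sine-plus-noise series, and assumed window length and hyperparameters.

```python
# Illustrative sketch only: a plain (non-Bayesian) LSTM trained to predict the next
# temperature sample from a window of past samples. The synthetic series, window
# length and hyperparameters are assumptions, not values from the paper.
import torch
import torch.nn as nn

class TempLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                   # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])     # predict the next value in the series

# Synthetic "paleoclimate-like" series: one slow cycle plus noise.
t = torch.arange(0, 2000, dtype=torch.float32)
series = torch.sin(2 * torch.pi * t / 500) + 0.1 * torch.randn_like(t)

window = 50
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model, loss_fn = TempLSTM(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(20):                     # full-batch training, kept short for brevity
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```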
Procedia PDF Downloads 128
2063 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli
Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha
Abstract:
Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3,600 indentation measurements. This database served to create as many ensembles as there were segments in the tested beam. Statistics of these ensembles were then assigned to the given segments of the beams, and the Latin Hypercube Sampling (LHS) method was employed to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. A detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change in the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
Keywords: Bayesian inference, FEM, four-point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young's modulus
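The following is a hedged sketch of the sampling step only: Latin Hypercube realisations of per-segment Young's moduli drawn from assumed lognormal statistics and pushed through a placeholder deflection model. The segment count, distribution parameters and the beam_deflection stand-in are illustrative assumptions, not the values or the FEM/Mindlin model used in the paper.

```python
# Minimal sketch of the LHS step: 100 Latin Hypercube realisations of per-segment
# Young's moduli from assumed lognormal statistics; the deflection model is a stub.
import numpy as np
from scipy.stats import qmc, lognorm

n_segments, n_sim = 12, 100
sampler = qmc.LatinHypercube(d=n_segments, seed=0)
u = sampler.random(n=n_sim)                      # uniform samples in [0, 1)

# Assumed lognormal statistics per segment (shape s, scale), in MPa.
s, scale = 0.15, 11000.0
E = lognorm.ppf(u, s=s, scale=scale)             # (n_sim, n_segments) moduli

def beam_deflection(E_row):
    """Placeholder for the FEM/Mindlin beam model used in the paper."""
    return 1.0 / np.mean(E_row)                  # stiffer beam -> smaller deflection

deflections = np.array([beam_deflection(row) for row in E])
print(deflections.mean(), deflections.std())    # statistics of the 100 deflections
```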
Procedia PDF Downloads 283
2062 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm
Authors: A. Cerrato Casado, C. Guigou, P. Jean
Abstract:
In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curve of the synthetic soils is constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils, and the final profiles are compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion process proves to be a robust procedure that is able to provide good solutions for complex soil profiles even with scarce prior information.
Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile
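A bare-bones sketch of the inversion step is given below: each particle is a vector of layer shear-wave velocities, and the cost is the misfit to an observed dispersion curve. The forward_dispersion function is a crude placeholder, not the vertical flexibility coefficient method, and all constants are assumptions.

```python
# Generic particle swarm optimization applied to a toy dispersion-curve misfit.
import numpy as np

rng = np.random.default_rng(0)
freqs = np.linspace(5, 50, 20)

def forward_dispersion(vs):
    """Placeholder forward model mapping layer velocities to phase velocities."""
    return np.interp(freqs, [5.0, 25.0, 50.0], vs[::-1])

vs_true = np.array([150.0, 250.0, 400.0])       # assumed "true" layer velocities (m/s)
observed = forward_dispersion(vs_true)

def cost(vs):
    return np.sum((forward_dispersion(vs) - observed) ** 2)

n_particles, n_dims, iters = 30, 3, 200
w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration coefficients
x = rng.uniform(100, 500, (n_particles, n_dims))
v = np.zeros_like(x)
pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    c = np.array([cost(p) for p in x])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], c[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(gbest)   # the best particle should approach vs_true for this toy problem
```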
Procedia PDF Downloads 185
2061 Exploring Individual Decision Making Processes and the Role of Information Structure in Promoting Uptake of Energy Efficient Technologies
Authors: Rebecca J. Hafner, Daniel Read, David Elmes
Abstract:
The current research applies decision-making theory to address the problem of increasing uptake of energy-efficient technologies in the marketplace, where uptake is currently slower than one might predict following rational choice models. Specifically, in two studies we apply the alignable/non-alignable features effect and explore the impact of varying information structure on consumers' preference for standard versus energy-efficient technologies. As researchers in the Interdisciplinary centre for Storage, Transformation and Upgrading of Thermal Energy (i-STUTE) are currently developing energy-efficient heating systems for homes and businesses, we focus on the context of home heating choice and compare preference for a standard condensing boiler versus an energy-efficient heat pump, according to experimental manipulations in the structure of prior information. In Study 1, we find that people show a stronger preference for alignable features when options are similar; an effect which is mediated by an increased tendency to infer that missing information is the same. Yet, in contrast to previous research, we find no effects of alignability on option preference when options differ. The advanced methodological approach used here, which is the first study of its kind to randomly allocate features as either alignable or non-alignable, highlights potential design effects in previous work. Study 2 is designed to explore the interaction between alignability and construal level as an explanation for the shift in attentional focus when options differ. Theoretical and applied implications for promoting energy-efficient technologies are discussed.
Keywords: energy-efficient technologies, decision-making, alignability effects, construal level theory, CO2 reduction
Procedia PDF Downloads 330
2060 Fragility Analysis of a Soft First-Story Building in Mexico City
Authors: Rene Jimenez, Sonia E. Ruiz, Miguel A. Orellana
Abstract:
On 09/19/2017, a Mw = 7.1 intraslab earthquake occurred in Mexico, causing the collapse of about 40 buildings. Many of these were 5- or 6-story buildings with a soft first story, so it is desirable to perform a structural fragility analysis of typical structures representative of those buildings and to propose a reliable structural solution. Here, a typical 5-story building constituted by regular R/C moment-resisting frames in the first story and confined masonry walls in the upper levels, similar to the structures that collapsed in the 09/19/2017 Mexico earthquake, is analyzed. Three different structural solutions of the 5-story building are considered: S1) the building is designed in accordance with the Mexico City Building Code-2004; S2) the column dimensions of the first story corresponding to S1 are reduced; and S3) viscous dampers are added at the first story of solution S2. A number of incremental dynamic analyses are performed for each structural solution, using a 3D structural model. The hysteretic behavior model of the masonry was calibrated with experiments performed at the Laboratory of Structures at UNAM. Ten seismic ground motions are used to excite the structures; they correspond to ground motions recorded in intermediate soil of Mexico City with a dominant period around 1 s, where the structures are located. The fragility curves of the buildings are obtained for different values of the maximum inter-story drift demands. Results show that solutions S1 and S3 lead to similar probabilities of exceedance of a given value of inter-story drift for the same seismic intensity, and that solution S2 presents a higher probability of exceedance for the same seismic intensity and inter-story drift demand. Therefore, it is concluded that solution S3 (which corresponds to the building with a soft first story and energy dissipation devices) can be a reliable solution from the structural point of view.
Keywords: demand hazard analysis, fragility curves, incremental dynamic analyses, soft first story, structural capacity
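As a side note, the sketch below illustrates how incremental dynamic analysis output is commonly turned into a fragility curve: count drift-limit exceedances per intensity level and fit a lognormal CDF. The peak drifts, drift limit and intensity measure are synthetic placeholders, not results from this study.

```python
# Count exceedances of a drift limit at each intensity level and fit a lognormal fragility.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
intensities = np.linspace(0.1, 1.0, 10)        # e.g. Sa(T1) in g (assumed measure)
n_records = 10                                 # ten ground motions, as in the study
drift_limit = 0.015                            # assumed inter-story drift capacity

# Synthetic peak drifts: grow with intensity, with record-to-record variability.
drifts = 0.02 * intensities[:, None] * rng.lognormal(0.0, 0.4, (intensities.size, n_records))
p_exceed = (drifts > drift_limit).mean(axis=1)

def fragility(im, theta, beta):
    """Lognormal fragility: P(drift > limit | IM = im)."""
    return norm.cdf(np.log(im / theta) / beta)

(theta, beta), _ = curve_fit(fragility, intensities, p_exceed, p0=(0.5, 0.4))
print("median capacity:", theta, "dispersion:", beta)
```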
Procedia PDF Downloads 178
2059 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function which takes values in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object's memberships in all clusters must be equal to one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have been incorporated into the fuzzy c-means clustering technique; they introduce additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where in our optimization problem we aim to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
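The sketch below illustrates one common form of entropy-regularised fuzzy clustering, in which memberships are softmax-like in the negative squared distances so that the sum-to-one constraint holds by construction; the exact relative-entropy term proposed in this paper may differ, and the data and parameters are assumptions.

```python
# Entropy-regularised fuzzy clustering on a toy two-cluster data set.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
n_clusters, gamma, iters = 2, 2.0, 50            # gamma is the regularization weight

centers = X[rng.choice(len(X), n_clusters, replace=False)]
for _ in range(iters):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # (n, k) squared distances
    logits = -d2 / gamma
    logits -= logits.max(axis=1, keepdims=True)                 # numerical stability
    U = np.exp(logits)
    U /= U.sum(axis=1, keepdims=True)                           # memberships sum to one
    centers = (U.T @ X) / U.sum(axis=0)[:, None]                # membership-weighted means

print(centers)
```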
Procedia PDF Downloads 259
2058 Analysis of Tourism Development Level and Research on Improvement Strategies - Taking Chongqing as an Example
Abstract:
As a member of the tertiary industry, tourism is an important driving factor for urban economic development. Chongqing is a well-known tourist city in China; according to statistics, the added value of tourism and related industries in 2022 reached 106.326 billion yuan, a year-on-year increase of 1.2%, accounting for 3.7% of the city's GDP. However, the overall tourism development level of Chongqing is seriously unbalanced: the tourism strength of the main urban area is much higher than that of southeast Chongqing, northeast Chongqing and the surrounding city tourism area, while the overall tourism strength of these other three regions is relatively balanced. Based on the estimation of tourism development level and the geographic detector method, this paper finds that the important factors affecting the tourism development level of non-main urban areas in Chongqing are A-level tourist attractions. Through GIS geospatial analysis technology and SPSS data correlation research methods, the spatial distribution characteristics and influencing factors of A-level tourist attractions in Chongqing were quantitatively analyzed using data such as the geospatial data cloud, relevant documents of the Chongqing Municipal Commission of Culture and Tourism Development, the planning cloud, and relevant statistical yearbooks. The results show that: (1) the spatial distribution of tourist attractions in non-main urban areas of Chongqing is agglomerated and uneven; (2) the spatial distribution of A-level tourist attractions in non-main urban areas of Chongqing is affected by ecological factors, and the degree of influence is in the order of water factors > topographic factors > green space factors.
Keywords: tourist attractions, geographic detectors, quantitative research, ecological factors, GIS technology, SPSS analysis
Procedia PDF Downloads 13
2057 Increasing Sustainability Using the Potential of Urban Rivers in Developing Countries with a Biophilic Design Approach
Authors: Mohammad Reza Mohammadian, Dariush Sattarzadeh, Mir Mohammad Javad Poor Hadi Hosseini
Abstract:
Population growth, urban development and urban buildup have disturbed the balance between nature and the city, leading to a loss of quality and sustainability in the areas adjacent to rivers, whereas in the past the sides of urban rivers were considered urban green space. Urban rivers and their banks, which have environmental, social and economic value, are important for achieving sustainable development. So far, efforts have been made at various scales in various cities around the world to revitalize these areas. On the other hand, biophilic design is an innovative design approach in which attention to natural details and the relation to nature is a fundamental concept. The purpose of this study is to provide an integrated framework of urban design using the potential of urban rivers (in order to increase sustainability) with a biophilic design approach to be used in cities in developing countries. The methodology of the research is based on the collection of data and information from research and projects, including a study on biophilic design, investigations and projects related to urban rivers, and a review of the literature on sustainable urban development. The study of urban river boundaries is then completed by examining case samples. Eventually, an integrated framework of urban design for the boundaries of urban rivers in the cities of developing countries is presented with regard to the factors affecting the design of these areas. The result shows that, according to this framework, the potential of the river banks is utilized to increase not only environmental sustainability but also social, economic and physical stability, with regard to water, light, the usage of indigenous materials, etc.
Keywords: urban rivers, biophilic design, urban sustainability, nature
Procedia PDF Downloads 288
2056 Influence of Hydrophobic Surface on Flow Past Square Cylinder
Authors: S. Ajith Kumar, Vaisakh S. Rajan
Abstract:
In external flows, vortex shedding behind bluff bodies causes a large number of engineering structures to experience unsteady loads, resulting in structural failure. Vortex shedding can even turn out to be disastrous, as in the Tacoma Bridge failure incident. We need to have control over vortex shedding to get rid of this untoward condition by reducing the unsteady forces acting on the bluff body. For circular cylinders, a hydrophobic surface on an otherwise no-slip body is found to delay separation and to minimize the effects of vortex shedding drastically. Flow over a square cylinder differs from this behavior, as separation can take place from either of the two corner separation points (front or rear). An attempt is made in this study to numerically elucidate the effect of a hydrophobic surface on flow over a square cylinder. A 2D numerical simulation has been carried out to understand the effects of the slip surface on the flow past a square cylinder. The details of the numerical algorithm will be presented at the time of the conference. A non-dimensional parameter, the Knudsen number, is defined to quantify the slip on the cylinder surface based on Maxwell's equation. The slip condition of the wall affects the vorticity distribution around the cylinder and the flow separation. In the numerical analysis, we observed that the hydrophobic surface enhances the shedding frequency and damps down the amplitude of oscillations of the square cylinder. We also found that the slip reduces aerodynamic force coefficients such as the coefficient of lift (CL) and the coefficient of drag (CD); hence, replacing the no-slip surface with a hydrophobic surface can be treated as an effective drag reduction strategy. The introduction of a hydrophobic surface can also be utilized to reduce vortex-induced vibrations (VIV) and is found to be an effective method of controlling VIV, thereby controlling structural failures.
Keywords: drag reduction, flow past square cylinder, flow control, hydrophobic surfaces, vortex shedding
Procedia PDF Downloads 375
2055 Effects of Planned Pre-laboratory Discussion on Physics Students’ Acquisition of Science Process Skills in Kontagora, Niger State
Authors: Akano Benedict Ubawuike
Abstract:
This study investigated the effects of pre-laboratory discussion on physics students' acquisition of science process skills. The study design was quasi-experimental, and a purposive sampling technique was applied in selecting two schools in Kontagora Town for the research based on the availability of a good physics laboratory. Intact classes, already grouped by the schools owing to limited laboratory space and equipment and comprising thirty (30) students, 15 in the experimental group in School A and 15 in the control group in School B, were the subjects for the research. The instruments used for data collection were the lesson prepared for pre-practical discussion and a researcher-made Science Process Skills Test (SPST), and two (2) research questions and two (2) research hypotheses were developed to guide the study. The data collected were analyzed using means and t-test statistics at the 0.05 level of significance. The study revealed that pre-laboratory discussion was more efficacious in enhancing students' acquisition of science process skills. It also revealed that gender had no significant effect on students' acquisition of science process skills. Based on the findings, it was recommended, among others, that teachers should encourage students to develop interest in practical activities by engaging them in pre-laboratory discussion and providing instructional materials that will challenge them to be actively involved during practical lessons. It is also recommended that Ministries of Education and professional organizations like the Science Teachers' Association of Nigeria (STAN) should organize workshops, seminars and conferences for physics teachers, and that physics concepts should be taught with practical activities so that the students will do science instead of learning about science.
Keywords: physics, laboratory, discussion, students, acquisition, science process skills
Procedia PDF Downloads 131
2054 On the Effect of Carbon on the Efficiency of Titanium as a Hydrogen Storage Material
Authors: Ghazi R. Reda Mahmoud Reda
Abstract:
Among the metals that form hydrides, Mg and Ti are known as the most lightweight materials; however, they are covered with a passive layer of oxides and hydroxides and require activation treatment under high temperature (> 300 °C) and hydrogen pressure (> 3 MPa) before being used for storage and transport applications. It is well known that a small graphite addition to Ti or Mg leads to a dramatic change in the kinetics of mechanically induced hydrogen sorption (uptake) and significantly stimulates the Ti-hydrogen interaction. Many explanations were given by different authors to explain the effect of graphite addition on the performance of Ti as a material for hydrogen storage. Not only graphite but also the addition of a polycyclic aromatic compound will improve the hydrogen absorption kinetics. It will be shown that the function of the carbon addition is two-fold. First, carbon acts as a vacuum cleaner, which scavenges out all the interstitial oxygen that can poison or slow down hydrogen absorption. It is also important to note that oxygen favors the chemisorption of hydrogen, which is not desirable for hydrogen storage. Second, while scavenging the interstitial oxygen, the carbon reacts with oxygen in the nano- and microchannels through a highly exothermic reaction to produce carbon dioxide and monoxide, which provide the necessary heat for activation; thus, in the presence of carbon, a lower heat of activation for hydrogen absorption is observed experimentally. Furthermore, the reaction of hydrogen with the carbon oxides will produce water, which, due to ball milling, hydrolyzes to produce the linear H5O2+; this reconstructs the primary structure of the nanocarbon to form a secondary structure, in which the primary structures (sheets of carbon) are connected through hydrogen bonding. It is the space between these sheets where physisorption or defect-mediated sorption occurs.
Keywords: metal forming hydrides, polar molecule impurities, titanium, phase diagram, hydrogen absorption
Procedia PDF Downloads 362
2053 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidity, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique itself does not improve the model significantly, but FSM, together with a machine learning technique called an ensemble, further improves the model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and to build a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
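For orientation only, a minimal sketch of the tabular part of such a pipeline is shown below: an ensemble classifier trained on synthetic EHR-style features and scored by AUC, the metric reported above. The data generator, class balance and model choice are assumptions and do not reproduce the study's demographic, laboratory or FSM-derived features.

```python
# Ensemble classifier on synthetic tabular data, evaluated by AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2322, n_features=40, n_informative=10,
                           weights=[0.85, 0.15], random_state=0)   # assumed ~15% mortality
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```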
Procedia PDF Downloads 134
2052 Physics-Informed Convolutional Neural Networks for Reservoir Simulation
Authors: Jiangxia Han, Liang Xue, Keda Chen
Abstract:
Despite the significant progress over the last decades in reservoir simulation using numerical discretization, meshing remains complex. Moreover, the high number of degrees of freedom of the space-time flow field makes the solution process very time-consuming. Therefore, we present Physics-Informed Convolutional Neural Networks (PICNN) as a hybrid scientific-theory and data method for reservoir modeling. Besides labeled data, the model is driven by the scientific theories of the underlying problem, such as governing equations, boundary conditions, and initial conditions. PICNN integrates governing equations and boundary conditions into the network architecture in the form of a customized convolution kernel. The loss function is composed of data matching, initial conditions, and other measurable prior knowledge. By customizing the convolution kernel and minimizing the loss function, the neural network parameters not only fit the data but also honor the governing equation. The PICNN provides a methodology to model and history-match flow and transport problems in porous media. Numerical results demonstrate that the proposed PICNN can provide an accurate physical solution from a limited dataset. We show how this method can be applied in the context of a forward simulation for continuous problems. Furthermore, several complex scenarios are tested, including the existence of data noise, different work schedules, and different well patterns.
Keywords: convolutional neural networks, deep learning, flow and transport in porous media, physics-informed neural networks, reservoir simulation
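A much-simplified sketch of the physics-informed idea follows: a small CNN predicts a 2D pressure field, a fixed 5-point Laplacian convolution kernel turns a toy governing equation (steady-state diffusion) into a residual loss, and a data-matching term is added at a few assumed "well" locations. The grid, equation, network and loss weights are illustrative assumptions, not the PICNN formulation of the paper.

```python
# Physics-informed loss built from a fixed Laplacian convolution kernel plus data matching.
import torch
import torch.nn as nn
import torch.nn.functional as F

laplacian = torch.tensor([[[[0., 1., 0.],
                            [1., -4., 1.],
                            [0., 1., 0.]]]])         # fixed "physics" kernel

net = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.Tanh(),
                    nn.Conv2d(16, 1, 3, padding=1))

z = torch.randn(1, 1, 32, 32)                         # latent input field
obs_idx = (torch.tensor([5, 16, 27]), torch.tensor([5, 16, 27]))
obs_val = torch.tensor([1.0, 0.5, 0.0])               # assumed well pressures

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(500):
    p = net(z)
    pde_res = F.conv2d(p, laplacian)                  # interior PDE residual
    loss_pde = (pde_res ** 2).mean()
    loss_data = ((p[0, 0][obs_idx] - obs_val) ** 2).mean()
    loss = loss_pde + 10.0 * loss_data                # honor physics and match data
    opt.zero_grad()
    loss.backward()
    opt.step()
```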
Procedia PDF Downloads 143
2051 The Effects of Water Fraction and Salinity on Crude Oil-Water Dispersions
Authors: Ramin Dabirian, Yi Zhang, Ilias Gavrielatos, Ram Mohan, Ovadia Shoham
Abstract:
Oil-water emulsions can be found in almost every part of the petroleum industry, namely in reservoir rocks, drilling cuttings circulation, production in wells, transportation pipelines, surface facilities and refining processes. It is necessary for oil production and refinery engineers to resolve petroleum emulsion problems and to eliminate contaminants in order to meet environmental standards, achieve the desired product quality and improve equipment reliability and efficiency. A state-of-the-art Dispersion Characterization Rig (DCR) has been utilized to investigate crude oil-distilled water dispersion separation. Over 80 experimental tests were run to investigate the flow behavior and stability of the dispersions. The experimental conditions include the effects of water cuts (25%, 50% and 75%), NaCl concentrations (0, 3.5% and 18%), mixture flow velocities (0.89 and 1.71 ft/s), and orifice plate types on the separation rate. The experimental data demonstrate that the water cut can significantly affect the separation time and efficiency. Dispersions with lower water cuts take a longer time to separate and have lower separation efficiency. The medium and lower water cuts result in the formation of mousse emulsions, and phase inversion happens around the medium water cut. The data also confirm that increasing the NaCl concentration in the aqueous phase can increase the crude oil-water dispersion separation efficiency, especially at higher salinities. The separation profile for dispersions with lower salt concentrations has a lower sedimentation rate slope before the inflection point. Dispersions in all tests with higher salt concentrations have a larger sedimentation rate. The presence of NaCl can influence the interfacial tension gradients along the interface, and it plays a role in avoiding mousse emulsion formation.
Keywords: oil-water dispersion, separation mechanism, phase inversion, emulsion formation
Procedia PDF Downloads 181
2050 [Keynote Talk]: Mathematical and Numerical Modelling of the Cardiovascular System: Macroscale, Mesoscale and Microscale Applications
Authors: Aymen Laadhari
Abstract:
The cardiovascular system is centered on the heart and is characterized by a very complex structure with different physical scales in space (e.g. micrometers for erythrocytes and centimeters for organs) and time (e.g. milliseconds for human brain activity and several years for the development of some pathologies). The development and numerical implementation of mathematical models of the cardiovascular system is a tremendously challenging topic at the theoretical and computational levels, and has consequently induced a growing interest over the past decade. Accurate computational investigation, in both healthy and pathological cases, of processes related to the functioning of the human cardiovascular system holds great potential for tackling several problems of clinical relevance and for improving the diagnosis of specific diseases. In this talk, we focus on the specific task of simulating three particular phenomena related to the cardiovascular system on the macroscopic, mesoscopic and microscopic scales, respectively. Namely, we develop numerical methodologies tailored for the simulation of (i) the haemodynamics (i.e., the fluid mechanics of blood) in the aorta and sinus of Valsalva interacting with highly deformable thin leaflets, (ii) the hyperelastic anisotropic behaviour of cardiomyocytes and the influence of calcium concentrations on the contraction of single cells, and (iii) the dynamics of red blood cells in the microvasculature. For each problem, we present an appropriate fully Eulerian finite element methodology. We report several numerical examples to address in detail the relevance of the mathematical models in terms of physiological meaning and to illustrate the accuracy and efficiency of the numerical methods.
Keywords: finite element method, cardiovascular system, Eulerian framework, haemodynamics, heart valve, cardiomyocyte, red blood cell
Procedia PDF Downloads 252
2049 A Case of Survival with Self-Draining Haemopericardium Secondary to Stabbing
Authors: Balakrishna Valluru, Ruth Suckling
Abstract:
A 16-year-old male was found collapsed on the road following stab injuries to the chest and abdomen and was transported to the emergency department by ambulance. On arrival in the emergency department, the patient was breathless and appeared pale. He was maintaining his airway with spontaneous breathing and had a heart rate of 122 beats per minute with a blood pressure of 83/63 mmHg. He was resuscitated initially with three units of packed red cells. Clinical examination identified three incisional wounds, each measuring 2 cm. These were in the left para-sternal region, the right infra-scapular region and the left upper quadrant of the abdomen. The chest wound over the left parasternal area at the level of the 4th intercostal space was bleeding intermittently on leaning forwards and was intermittently relieving his breathlessness. CT imaging was performed to characterize his injuries and determine his management. The CT scan of the chest and abdomen showed a moderate-sized haemopericardium with a left-sided haemopneumothorax. The patient underwent urgent surgical repair of the left ventricle and left anterior descending artery. He recovered without complications and was discharged from the hospital. This case highlights the fact that the potential to develop a life-threatening cardiac tamponade was mitigated by the left parasternal stab wound. This injury fortuitously provided a pericardial window through which the bleeding from the injured left ventricle and left anterior descending artery could drain into the left hemithorax, providing an opportunity for timely surgical intervention to repair the cardiac injuries.
Keywords: stab, incisional, haemo-pericardium, haemo-pneumothorax
Procedia PDF Downloads 202
2048 Architecture for QoS Based Service Selection Using Local Approach
Authors: Gopinath Ganapathy, Chellammal Surianarayanan
Abstract:
Services are growing rapidly, and generally they are aggregated into a composite service to accomplish complex business processes. There may be several services that offer the same required function for a particular task in a composite service. Hence, a choice has to be made to select suitable services from alternative, functionally similar services. Quality of Service (QoS) acts as a discriminating factor in deciding which component services should be selected to satisfy the quality requirements of a user during service composition. There are two categories of approaches for QoS-based service selection, namely global and local approaches. Global approaches are known to be NP-hard in time and offer poor scalability in large-scale composition. As an alternative to global methods, local selection methods, which reduce the search space by breaking up the large and complex problem of selecting services for the workflow into independent sub-problems of selecting services for individual tasks, are emerging. In this paper, a distributed architecture for selecting services based on QoS using local selection is presented, with an overview of the local selection methodology. The architecture describes the core components needed to implement the local approach, namely the selection manager and the QoS manager, and their functions. The selection manager consists of two components: the constraint decomposer, which decomposes the given global or workflow-level constraints into local or task-level constraints, and the service selector, which selects an appropriate service for each task with maximum utility, satisfying the corresponding local constraints. The QoS manager manages the QoS information at two levels, namely the service class level and the individual service level. The architecture serves as an implementation model for local selection.
Keywords: architecture of service selection, local method for service selection, QoS based service selection, approaches for QoS based service selection
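A toy sketch of the local-selection idea described above is given below: a global latency budget is decomposed into equal per-task budgets (the constraint decomposer's role), and each task independently picks the candidate with the best utility among those meeting its local constraint (the service selector's role). The budgets, candidate values and utility function are illustrative assumptions, not the paper's actual decomposition or utility model.

```python
# Local QoS-based selection: decompose a global constraint, then select per task.
workflow = {
    "task1": [{"name": "s11", "latency": 40, "cost": 5},
              {"name": "s12", "latency": 80, "cost": 2}],
    "task2": [{"name": "s21", "latency": 60, "cost": 4},
              {"name": "s22", "latency": 30, "cost": 7}],
}
global_latency_budget = 120
local_budget = global_latency_budget / len(workflow)    # constraint decomposer

def utility(service):
    return -service["cost"]                              # assumed utility: cheaper is better

selection = {}
for task, candidates in workflow.items():                # service selector
    feasible = [s for s in candidates if s["latency"] <= local_budget]
    selection[task] = max(feasible, key=utility)["name"]

print(selection)   # {'task1': 's11', 'task2': 's21'}
```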
Procedia PDF Downloads 426
2047 Evaluating the Relationship between Neighbourhood Satisfaction and Urban Safety: The Case Study of Riverwood, Sydney
Authors: Samaneh Arasteh
Abstract:
Neighbourhood satisfaction and safety are the two main components of urban life and have a substantial impact on residents' quality of life. The relationship between these two components, especially in areas surrounding our individual private dwellings, is highly influential on many social, economic, and wellbeing activities that may benefit neighbourhood residents. Neighbourhood and urban design – which are liable to be affected by the perceived quality of local public spaces – are likely to be significant factors influencing broader residents' feelings of safety. With this in mind, this study reviews recent normative literature on how these design processes have influenced neighbourhood satisfaction, including perceived safety, with a focus on different aspects of public spaces, including planning, management, and design, in a mixed-tenure neighbourhood. Following the study aim, Riverwood in Sydney's southwest was chosen as a case study to gain a detailed understanding of the context by engaging with community members, residents, non-government organisations, and experts. Moreover, archival studies on neighbourhood satisfaction and safety, expert interviews, and resident questionnaires are presented to shed light on the relationship between neighbourhood satisfaction and the perception of safety. The study argues that, for safer neighbourhoods in urban areas, socio-cultural factors need to be aligned toward strengthening physical factors, and that, to make environments safer, it is important to understand the practical and achievable mechanisms required to improve existing estates. Findings show that increasing the clarity of community social and physical environmental involvements can promote residents' feelings of safety and, in turn, neighbourhood satisfaction.
Keywords: neighbourhood satisfaction, public space, Riverwood, urban safety
Procedia PDF Downloads 181
2046 Finite Element Modeling of a Lower Limb Based on the East Asian Body Characteristics for Pedestrian Protection
Authors: Xianping Du, Runlu Miao, Guanjun Zhang, Libo Cao, Feng Zhu
Abstract:
Current vehicle safety standards and human body injury criteria were established based on the biomechanical response of the Euro-American human body, without considering the differences in body anthropometry and injury characteristics among different races, particularly East Asian people with a smaller body size. The absence of such race-specific design considerations negatively influences the protective performance of safety products for these populations and weakens the accuracy of the injury thresholds derived. To resolve these issues, in this study, we aim to develop a race-specific finite element model to simulate the impact response of the lower extremity of a 50th percentile East Asian (Chinese) male. The model was built based on medical images of the leg of an average-sized Chinese male and slightly adjusted based on the statistical data. The model includes detailed anatomic features and is able to simulate the active muscle force. Thirteen biomechanical tests available in the literature were used to validate its biofidelity. Using the validated model, a pedestrian-car impact accident that took place in China was reconstructed computationally. The results show that the newly developed lower leg model performs well in predicting the dynamic response and tibia fracture pattern. An additional comparison of the fracture tolerance of the East Asian and Euro-American lower limb suggests that the current injury criterion underestimates the degree of injury of the East Asian human body.
Keywords: lower limb, East Asian body characteristics, traffic accident reconstruction, finite element analysis, injury tolerance
Procedia PDF Downloads 289
2045 Fire and Explosion Consequence Modeling Using Fire Dynamic Simulator: A Case Study
Authors: Iftekhar Hassan, Sayedil Morsalin, Easir A Khan
Abstract:
Accidents involving fire have occurred frequently in recent times, and their causes show a great deal of variety, so the required intervention methods and risk assessment strategies are unique in each case. On September 4, 2020, a fire and explosion occurred in a confined space, caused by a methane gas leak from an underground pipeline, in the Baitus Salat Jame mosque during night (Esha) prayer in Narayanganj District, Bangladesh, killing 34 people. In this research, this incident is simulated using Fire Dynamics Simulator (FDS) software to analyze and understand the nature of the accident and the associated consequences. FDS is an advanced computational fluid dynamics (CFD) code for fire-driven fluid flow, which numerically solves a large eddy simulation form of the Navier–Stokes equations to simulate fire and smoke spread and to predict thermal radiation, toxic substance concentrations and other relevant fire parameters. This study focuses on understanding the nature of the fire and on evaluating the consequences of the thermal radiation caused by the vapor cloud explosion. An evacuation model was constructed to visualize the effect of evacuation time and fractional effective dose (FED) for different types of agents. The results were presented as 3D animations, slice pictures and graphical representations to understand the fire hazards caused by thermal radiation or smoke due to the vapor cloud explosion. This study will help to design and develop appropriate response strategies for preventing similar accidents.
Keywords: consequence modeling, fire and explosion, fire dynamics simulation (FDS), thermal radiation
Procedia PDF Downloads 225
2044 Faceless Women: The Blurred Image of Women in Film on and Off-Screen
Authors: Ana Sofia Torres Pereira
Abstract:
To this day, women have been underrepresented and stereotyped on both TV and cinema screens all around the world. While women have been gaining a different status and finding their own voice in the workplace and in society, what we see on-screen is still something different, something gender-biased, something that does not show the multifaceted identities a woman might have. But why is this so? Why are we stuck on this shallow vision of women on-screen? According to several cinema industry studies, most film screenwriters in Hollywood are men. Women actually represent a very low percentage of screenwriters. So why is this relevant? Could the underrepresentation of women screenwriters in Hollywood be affecting the way women are written, and as a result, depicted in film? Films are about stories, about people, and if these stories are continuously told through a man's gaze, is that helping in the creation of a gender imbalance towards women? On the other hand, one of the reasons given for the low percentage of women screenwriters is that women are said to be better at writing specific genres, like dramas and comedies, and not as good at writing thrillers and action films; so, as women seem to be limited in the genres they can write, they are undervalued and underrepresented as screenwriters. It seems the gender bias and stereotyping are not reserved exclusively for women on-screen, but extend off-screen and behind the screen as well. So film appears to be a men's world, on and off-screen, and since men seem to write the majority of scripts, it might be no wonder that women have been written in a specific way and depicted in a specific way on-screen. Also, since films are a mass communication medium, maybe this over-sexualization and stereotyping on-screen is indoctrinating our society into believing this bias is alive and well, and thus targeting women off-screen as well (ergo, screenwriters). What about the very beginning of film? In the Silent Movies and Early Talkies era, women dominated the screenwriting industry. They wrote every genre, and the majority of scripts were written by women, not men. So what about then? How were women depicted in films then? Did women screenwriters, in an era that was still very harsh on women, use their stories and their power to break stereotypes and show women in a different light, or did they carry on with the stereotype, continuing it and standardizing it? This paper aims to understand how important it is to have more working women screenwriters in order to break stereotypes regarding the image of women on and off-screen. How much can a screenwriter (male or female) influence our gaze on women (on and off-screen)?
Keywords: cinema, gender bias, stereotype, women on-screen, women screenwriters
Procedia PDF Downloads 348
2043 The Effects of L2 Storybook Reading and Interactive Vocabulary Instruction on Vocabulary Acquisition
Authors: Lenore Van Den Berg
Abstract:
Vocabulary development is positively associated with reading development, reading comprehension, and academic achievement. It is frequently stated that South Africa is in the midst of a literacy crisis. The past 24 years since the first democratically elected government have not revolutionised the education system; rather, after various curriculum changes and continued struggles to incorporate all 11 official languages as languages of instruction, research shows that 78 per cent of South African Grade 4 learners are functionally illiterate. The study sets out to find solutions to this problem and to add to the research base on vocabulary acquisition by assessing the effect of integrating the principles of explicit, interactive vocabulary instruction, within the context of storybook reading, on Grade 1 vocabulary acquisition. Participants comprised 69 Grade 1 English second-language learners from three classes in two government primary schools. The two schools differ in socio-economic status (SES), with School A having a lower SES than School B. One Grade 1 class was randomly assigned to be the Experimental Group, while two other classes served as control groups. The intervention took place over a period of 18 weeks and consisted of 30-minute storybook reading sessions, accompanied by interactive vocabulary instruction, twice a week. The Peabody Picture Vocabulary Test IV (PPVT-IV) was the diagnostic test administered to all learners before the intervention, as a pre-test, and after the intervention, as a post-test. Data regarding existing vocabulary instruction practices and approaches were also collected through classroom observations and individual, semi-structured interviews with the Experimental Group's teacher. Findings suggest that second language storybook reading, accompanied by explicit, interactive vocabulary instruction, has a positive impact on Grade 1 vocabulary acquisition, but that vocabulary teaching practices and socio-economic status also play a key role in vocabulary acquisition.
Keywords: interactive vocabulary instruction, second language vocabulary, storybook reading, vocabulary acquisition, reading development, PPVT
Procedia PDF Downloads 87
2042 Energy-Saving Methods and Principles of Energy-Efficient Concept Design in the Northern Hemisphere
Authors: Yulia A. Kononova, Znang X. Ning
Abstract:
Nowadays, architectural development is getting faster and faster. Nevertheless, modern architecture often does not meet all the requirements that could help our planet to get better. As we know, people consume an enormous amount of energy every day of their lives. Because of this uncontrolled energy usage, people have to increase energy production. As the energy production process demands a lot of fuel sources, it causes many problems such as climate change, environmental pollution, animal extinction, and a lack of energy sources. Nevertheless, nowadays humanity has all the opportunities to change this situation. Architecture is one of the most popular fields where it is possible to apply new methods of saving energy or even creating it. Nowadays there are kinds of buildings that can meet these new demands. One of them is energy-efficient buildings, which can save or even produce energy by combining several energy-saving principles. The main aim of this research is to provide information that helps to apply energy-saving methods while designing an environment-friendly building. The research methodology requires gathering relevant information from the literature, building guideline documents and previous research works in order to analyze it and sum it up into material that can be applied to energy-efficient building design. Regarding the results, it should be noted that the usage of all the energy-saving methods applied to a building design project results in ultra-low-energy buildings that require little energy for space heating or cooling. As a conclusion, it can be stated that developing methods of passive house design can decrease the need for energy production, which is an important issue that has to be solved in order to conserve the planet's resources and decrease environmental pollution.
Keywords: accumulation, energy-efficient building, storage, superinsulation, passive house
Procedia PDF Downloads 262
2041 Impact of a Novel Technique of S-Shaped Tracheostoma in Pediatric Tracheostomy in Intensive Care Unit on Success and Procedure Related Complications
Authors: Devendra Gupta, Sushilk K. Agarwal, Amit Kesari, P. K. Singh
Abstract:
Objectives: Pediatric patients may often experience persistent respiratory failure that requires tracheostomy placement in the pediatric ICU. We have designed a technique of tracheostomy in pediatric patients with an S-shaped incision on the tracheal wall, with a higher success rate and a lower complication rate. Technique: Following general anesthesia and positioning of the patient, the trachea was exposed in the midline by a vertical skin incision. In order to make an S-shaped tracheostoma, the second tracheal ring was identified. The conventional vertical incision was made in the second tracheal ring and then extended at both its ends laterally in the inter-cartilaginous space, parallel to the tracheal cartilage, in opposite directions to make the incision S-shaped. The trachea was dilated with a tracheal dilator, and an appropriately sized tracheostomy tube was then placed into the trachea. Results: S-shaped tracheostomy was performed in 20 children with a mean age of 6.25 years (age range 2-7) requiring tracheostomy placement. The tracheostomy tubes were successfully placed in all the patients in a single attempt. There was no incidence of significant intra-operative bleeding, subcutaneous emphysema, vocal cord palsy or pneumothorax. Two patients developed pneumonia and expired within a year. However, there was no incidence of tracheo-esophageal fistula, suprastomal collapse or difficulty in decannulation at one year of follow-up related to our technique. One patient developed late tracheitis, which was managed conservatively. Conclusion: S-shaped tracheoplasty was associated with a high success rate and a reduced risk of early and late complications in pediatric patients requiring tracheostomy.
Keywords: pediatrics, tracheostomy, ICU, tracheostoma
Procedia PDF Downloads 264
2040 On Lie-Central Derivations and Almost Inner Lie-Derivations of Leibniz Algebras
Authors: Natalia Pacheco Rego
Abstract:
The Liezation functor is a map from the category of Leibniz algebras to the category of Lie algebras, which assigns to a Leibniz algebra the Lie algebra given by the quotient of the Leibniz algebra by the ideal spanned by its square elements. This functor is left adjoint to the inclusion functor that considers a Lie algebra as a Leibniz algebra. This environment fits in the framework of central extensions and commutators in semi-abelian categories with respect to a Birkhoff subcategory, where classical or absolute notions are relative to the abelianization functor. Classical properties of Leibniz algebras (properties relative to the abelianization functor) were adapted to the relative setting (with respect to the Liezation functor); in general, absolute properties have corresponding relative ones, but not all absolute properties immediately hold in the relative case, so new requirements are needed. Following this line of research, an analysis of central derivations of Leibniz algebras relative to the Liezation functor, called Lie-derivations, was conducted, and a characterization of Lie-stem Leibniz algebras by their Lie-central derivations was obtained. In this paper, we present an overview of these results, and we analyze some new properties concerning Lie-central derivations and almost inner Lie-derivations. Namely, a Leibniz algebra is a vector space equipped with a bilinear bracket operation satisfying the Leibniz identity. We define the Lie-bracket by [x, y]lie = [x, y] + [y, x], for all x, y. The Lie-center of a Leibniz algebra is the two-sided ideal of elements that annihilate all the elements of the Leibniz algebra through the Lie-bracket. A Lie-derivation is a linear map which acts as a derivation with respect to the Lie-bracket. Obviously, usual derivations are Lie-derivations, but the converse is not true in general. A Lie-derivation is called a Lie-central derivation if its image is contained in the Lie-center. A Lie-derivation is called an almost inner Lie-derivation if the image of each element x is contained in the Lie-commutator of x and the Leibniz algebra. The main results we present in this talk refer to the conditions under which Lie-central derivations and almost inner Lie-derivations coincide.
Keywords: almost inner Lie-derivation, Lie-center, Lie-central derivation, Lie-derivation
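For readability, the definitions sketched in the abstract can be restated in displayed form as follows; the notation is assumed (a Leibniz algebra q with bracket [ , ]), and the paper's precise conventions may differ in detail.

```latex
% Displayed restatement of the definitions given in the abstract (standard notation assumed).
\begin{align*}
  &\text{Leibniz identity:} && [x,[y,z]] = [[x,y],z] - [[x,z],y],\\
  &\text{Lie-bracket:}      && [x,y]_{\mathrm{lie}} = [x,y] + [y,x],\\
  &\text{Lie-center:}       && Z_{\mathrm{lie}}(\mathfrak{q}) = \{\, z \in \mathfrak{q} : [z,x]_{\mathrm{lie}} = 0 \ \text{for all } x \in \mathfrak{q} \,\},\\
  &\text{Lie-derivation:}   && d([x,y]_{\mathrm{lie}}) = [d(x),y]_{\mathrm{lie}} + [x,d(y)]_{\mathrm{lie}},\\
  &\text{Lie-central derivation:} && d(\mathfrak{q}) \subseteq Z_{\mathrm{lie}}(\mathfrak{q}).
\end{align*}
```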
Procedia PDF Downloads 136
2039 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge
Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi
Abstract:
Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place, and adequate intervention will most probably reduce future maintenance costs, minimize downtime and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible, decisions must be made at the right time, which implies using systems that can detect abnormalities at an early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructures. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements, using the method suggested in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial Neural Networks are trained, and their ability to predict structural behavior is evaluated in the light of a case study where acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is presently still in operation despite experiencing obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises concepts like clustering analysis and statistical hypothesis testing. These make it possible to interpret the obtained prediction errors, draw conclusions about the state of the structure and thus support decision-making regarding its maintenance.
Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring
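To make the prediction-error idea concrete, here is a small, hedged sketch: a neural network is trained on "healthy" data to predict one acceleration channel from another, an error threshold is set from the training residuals, and later data are flagged when the error drifts beyond it. The signals, channel relationship and threshold rule are synthetic placeholders, not the Lidingö Bridge data or the exact procedure of the paper.

```python
# Model-free anomaly flagging via prediction errors of a channel-to-channel regressor.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 6000)
ch1 = np.sin(2 * np.pi * 2.0 * t) + 0.05 * rng.normal(size=t.size)
ch2 = 0.8 * ch1 + 0.05 * rng.normal(size=t.size)           # assumed healthy relation

X_train, y_train = ch1[:4000].reshape(-1, 1), ch2[:4000]
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500,
                     random_state=0).fit(X_train, y_train)

baseline_err = np.abs(model.predict(X_train) - y_train)
threshold = baseline_err.mean() + 3 * baseline_err.std()   # simple alarm threshold

# "New" measurements where the channel relationship has shifted (simulated change).
ch2_new = 0.5 * ch1[4000:] + 0.05 * rng.normal(size=2000)
err = np.abs(model.predict(ch1[4000:].reshape(-1, 1)) - ch2_new)
print("fraction of samples flagged:", (err > threshold).mean())
```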
Procedia PDF Downloads 208
2038 Exploring Community Benefits Frameworks as a Tool for Addressing Intersections of Equity and the Green Economy in Toronto's Urban Development
Authors: Cheryl Teelucksingh
Abstract:
Toronto is in the midst of an urban development and infrastructure boom. Population growth and concerns about urban sprawl and carbon emissions have led to pressure on the municipal and provincial governments to re-think urban development. Toronto's approach to climate change mitigation and adaptation has positioned the emerging green economy as part of the solution. However, the emerging green economy may not benefit all Torontonians in terms of jobs, improved infrastructure, and enhanced quality of life. Community benefits agreements (CBAs) are comprehensive, negotiated commitments in which founders and builders of major infrastructure projects formally agree to work with community interest groups based in the community where the development is taking place, toward mutually beneficial environmental and labor market outcomes. When community groups are equitably represented in the process, they stand to benefit not only from the jobs created by the project itself, but also from the longer-term community benefits related to the quality of the completed work, including advocating for communities' environmental needs. It is believed that green employment initiatives in Toronto should give greater consideration to best practices learned from community benefits agreements. Drawing on the findings of a funded qualitative study in Toronto (Canada), “The Green Gap: Toward Inclusivity in Toronto’s Green Economy” (2013-2016), this paper examines the emergent CBA in Toronto in relation to the development of a light rail transit project. Theoretical and empirical consideration will be given to the research gaps around CBAs, the role of various stakeholders, and the potential for CBAs to gain traction in Toronto's urban development context. The narratives of various stakeholders across Toronto's green economy will be interwoven with a discussion of the CBA model in Toronto and other jurisdictions.
Keywords: green economy in Toronto, equity, community benefits agreements, environmental justice, community sustainability
Procedia PDF Downloads 342
2037 Media Representation of Romanian Migrants in the Italian Media: A Comparative Study
Authors: Paula-Catalina Meirosu
Abstract:
Intra-EU economic migration is a topic of debate in the public space in both countries of origin and countries of destination. Since the 1990s, after the collapse of the communist regimes and then the accession of some former communist countries to the EU, the migratory flows of migrants (including Romanian migrants) to EU countries have increased constantly. Italy is one of the main countries of destination for Romanians, since at the moment Italy hosts more than one million Romanian migrants. Based on an interdisciplinary analytical framework focused on theories in the fields of transnationalism, media and migration studies and critical media analysis, this paper investigates the media construction of intra-EU economic migration in the Italian press from two main perspectives. The first is the media representation of Romanian migrants in the Italian press in a specific context: the EU elections in 2014. The second explores the way in which Romanian journalists use the media in the destination countries (such as Italy) as a source to address the issue of migration. In this context, the paper focuses on online articles related to Romanian migrants' representation in the media before and during the EU elections in two newspapers (La Repubblica from Italy and Adevarul from Romania), published during January-May 2014. The methodology is based on a social-constructivist, predominantly discursive approach and includes elements of critical discourse analysis (CDA) to identify the patterns of representation of Romanian migrants in the Italian press, as well as strategies for building the categories, identities, and roles of migrants. The aim of such an approach is to find out the dynamics of the media discourse on migration in a destination country in the light of a European electoral context (the EU elections) and, based on the results, to propose scenarios for the elections to be held this year.
Keywords: migration, media discourse, Romanian migrants, transnationalism
Procedia PDF Downloads 134
2036 “Lightyear” – The Battle for LGBTQIA+ Representation Behind Disney/Pixar’s Failed Blockbuster
Authors: Ema Vitória Fonseca Lavrador
Abstract:
In this work, we intend to explore the impact that the film “Lightyear” (2022) had on the social context of its production, distribution, and reception. This film, produced by Walt Disney Animation Studios and Pixar Animation Studios, depicts the story of Buzz Lightyear, the Space Ranger on whom the character of the same name in the “Toy Story” film franchise is based. This prequel was predicted to be the blockbuster of the year, but it was a financial fiasco and the subject of numerous controversies, which also caused it to be drowned out by the film “Minions: The Rise of Gru” (2022). The reason for its failure is based not on the film's narrative or quality but on its controversial context: a commitment to LGBTQIA+ representation in an unexpected way, by featuring a same-sex couple and showing a kiss shared by them. This representation cost Disney distribution in countries opposed to LGBTQIA+ representation in the media and involved Disney in major disagreements with fans and politicians, especially for standing in direct opposition to the Florida House Bill 1557, also called the “Don't Say Gay” bill. Many major companies have taken a stand against this law because it jeopardizes the safety of the LGBTQIA+ community, and, although Disney initially cut the kiss from the film, pressure from staff and audiences resulted in unprecedented progress. For featuring a brief homosexual kiss, the film's exhibition was banned in several countries and discouraged by the same public that was previously the focus of Disney's attention, as this is a conservatively branded, “family-friendly” company. We believe it is relevant to study the case of “Lightyear” because it is a work that raises awareness and promotes the representation of communities affected during these dark times, while less legislation is being approved to protect the rights and safety of queer people.
Keywords: “Don't Say Gay” bill, gender stereotypes, LGBTQIA+ representation, Lightyear, Disney/Pixar
Procedia PDF Downloads 81
2035 The Greek Revolution Through the Foreign Press: The Case of the Newspaper The London Times in the Period 1821-1828
Authors: Euripides Antoniades
Abstract:
In 1821, the Greek Revolution movement, under the political influence that arose from the French Revolution and the corresponding movements in Italy, Germany and America, demanded the liberation of the nation and the establishment of an independent national state. Published topics in the British press regarding the Greek Revolution focused on: a) the right of the Greeks to claim their freedom from Turkish domination in order to establish an independent state based on the principle of national autonomy; b) criticism of Turkish rule as illegal and of the power of the Ottoman Sultan as arbitrary; c) the recognition of the Greek identity and its distinction from the Turkish one; and d) the endorsement of the Greeks as the descendants of the ancient Greeks. The London Times is a print publication that presents, in chronological or thematic order, news, opinions or announcements about the most important events that have occurred in a place during a specified period of time. A combination of qualitative and quantitative content analysis was applied. An attempt was made to record references to the Greek Revolution along with the usage of specific words and expressions that contribute to the representation of the historical events and their exposure to the reading public. Key findings of this research reveal that a) there was a notable frequency, length, and context of passionate daily articles concerning the events in Greece in The Times of London; b) British public opinion was influenced by this particular newspaper; and c) the newspaper published various news about the revolution, adopting the role of animator of the Greek struggle. In fact, this type of news was the main substance of The London Times' coverage, establishing a positive image of the Greek Revolution and contributing to European diplomatic developments. These factors brought about a change in the attitude of the British and the Russians respectively, who assumed a positive approach towards Greece.
Keywords: Greece, revolution, press, The London Times, Great Britain, mass media
Procedia PDF Downloads 86